A chatbot that can asynchronously and remotely answer visitor questions about the ARM Institute's previous projects, as well as basic robotics questions.
Through client conversations, I learned of a need for a seamless way for manufacturers to learn about robotics on their own terms. Additionally, the client continues to produce an overwhelming amount of research and projects in the field, preventing even experts from having a full picture.
By combining a Large Language Model (LLM) with retrieval-augmented generation (RAG) and a vector database, users gain unprecedented access to our client's knowledge base.
Try it out on this page (bottom right), or on a mockup of the client's site here.
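As a rough sketch of that RAG flow: the assistant embeds each question, retrieves the closest document chunks from the vector database, and asks the LLM to answer from that context. The model names and the `arm-projects` index below are illustrative placeholders, not the production configuration:

```typescript
import OpenAI from "openai";
import { Pinecone } from "@pinecone-database/pinecone";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment
const pinecone = new Pinecone(); // reads PINECONE_API_KEY from the environment
const index = pinecone.index("arm-projects"); // hypothetical index name

async function answer(question: string): Promise<string> {
  // 1. Embed the visitor's question into a vector.
  const embedding = await openai.embeddings.create({
    model: "text-embedding-3-small",
    input: question,
  });

  // 2. Retrieve the most similar document chunks from the vector database.
  const results = await index.query({
    vector: embedding.data[0].embedding,
    topK: 5,
    includeMetadata: true,
  });
  const context = results.matches
    .map((m) => String(m.metadata?.text ?? ""))
    .join("\n---\n");

  // 3. Ask the LLM to answer, grounded in the retrieved context.
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [
      { role: "system", content: `Answer using only this context:\n${context}` },
      { role: "user", content: question },
    ],
  });
  return completion.choices[0].message.content ?? "";
}
```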
Figure 1: The knowledge assistant, prototyped with Voiceflow, staged for implementation
Currently, access to one client project, roboticscareer.org, is supported. In the near term, a candidate could go from no knowledge to a working understanding, and even enrollment in a skills training program, all from one interface.
Figure 2: Next iteration, prototyped with NextJS and Pinecone on Vercel
To assist with user testing, I deployed a more robust version of the digital assistant, capable of referencing specific portions of documents. This could be used by ARM Institute staff to support client recommendations.
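Referencing specific portions depends on how documents are indexed: each one is split into chunks that carry their source location as metadata, so retrieved passages can be traced back. A minimal sketch of that preprocessing step, with illustrative chunk sizes and metadata fields rather than the production values:

```typescript
interface Chunk {
  id: string;
  text: string;
  metadata: { source: string; page: number; offset: number };
}

// Split a document into overlapping chunks so each retrieved passage
// carries enough surrounding context to stand on its own.
function chunkDocument(
  source: string,
  pages: string[],
  size = 800,
  overlap = 100
): Chunk[] {
  const chunks: Chunk[] = [];
  pages.forEach((page, pageNum) => {
    for (let offset = 0; offset < page.length; offset += size - overlap) {
      chunks.push({
        id: `${source}#${pageNum}-${offset}`,
        text: page.slice(offset, offset + size),
        metadata: { source, page: pageNum + 1, offset },
      });
    }
  });
  return chunks;
}
```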
A card activity kit, handed out by members of the ARM Institute. It is pocket-sized and can help manufacturers scope out solutions on-site while also facilitating conversations with operators.
Figure 3: A digital copy of the first deck
These cards have even more beneath the surface. Each pack includes an extended reality (XR) anchor, a QR code, an NFC chip link, and a 3D-printed holding kit. They are also functional for games! With these additional touchpoints, our client should have no issue keeping published content in top condition, and facilities all across the region gain a new breakroom staple.
While our client does have XR deployments, these are limited to two-dimensional link aggregation. Our prototype differs in that it allows the placement of 3D models inside the manufacturer's own facility, rather than requiring the original image anchor inside our client's lobby.
Figure 4: Current client XR deployment
For environments where cameras are ineffective, or where the user's operating system is incompatible with the spatial feature set, we included the NFC chip in the card carrier. This allows the vast majority of mobile devices to simply tap to receive a URL prompt.
Figure 5: Card testing with the cohort
These analog cards certainly had merits, and I wanted the deliverable to include at least one offering that wasn't network-dependent. This brought me to the idea of using extended reality, which could augment the cards in some scenarios but would not be necessary to receive the benefits. To prototype this, I used translucent NFC chips that would better convey the mixed reality component, as well as serve as an image anchor for testing. I also took advantage of NFC's ability to be rewritten, dynamically updating cards and their pages for each test.
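Rewriting a card between tests is a short operation with the Web NFC API (currently limited to Chrome on Android). A minimal sketch, with a hypothetical test URL:

```typescript
// Web NFC is currently available in Chrome on Android.
declare const NDEFReader: any; // not yet in the standard TypeScript DOM lib

async function rewriteCard(url: string): Promise<void> {
  const ndef = new NDEFReader();
  // Writes a URL record to the next tag tapped against the phone;
  // afterwards, tapping the card shows a URL prompt on most devices.
  await ndef.write({ records: [{ recordType: "url", data: url }] });
}

// Example: point a card at a fresh test page before a session
// (hypothetical URL, triggered from e.g. a button handler).
rewriteCard("https://example.com/cards/welding-demo");
```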
Figure 6: Adapting and encoding cards on-site
Future work could include a mounting bracket to store cards near relevant equipment. Currently, each card includes signifiers for process, training level, and integration points.
Figure 7: 3D printed accompanying holder
Building from the analog cards, I tested extended reality as a way to provide additional utility for education and training.
As a late-breaking feature, I tested this in-house with the Robotics Institute at Carnegie Mellon University, and adapted the functionality with A-Frame, WebXR, and Three.js for use in the deliverable.
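At the core of that functionality is WebXR's hit-test API, which locates real-world surfaces through the camera so a model can be anchored in place. A condensed Three.js sketch of the placement loop, using placeholder geometry rather than the actual card models:

```typescript
import * as THREE from "three";
import { ARButton } from "three/examples/jsm/webxr/ARButton.js";

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(70, innerWidth / innerHeight, 0.01, 20);
const renderer = new THREE.WebGLRenderer({ antialias: true, alpha: true });
renderer.setSize(innerWidth, innerHeight);
renderer.xr.enabled = true;
document.body.appendChild(renderer.domElement);
document.body.appendChild(
  ARButton.createButton(renderer, { requiredFeatures: ["hit-test"] })
);

// The reticle marks the detected surface point where a model can be placed.
const reticle = new THREE.Mesh(
  new THREE.RingGeometry(0.1, 0.12, 32).rotateX(-Math.PI / 2),
  new THREE.MeshBasicMaterial()
);
reticle.matrixAutoUpdate = false;
reticle.visible = false;
scene.add(reticle);

let hitTestSource: any = null; // XRHitTestSource (types via @types/webxr)

renderer.xr.addEventListener("sessionstart", async () => {
  const session = renderer.xr.getSession()!;
  const viewerSpace = await session.requestReferenceSpace("viewer");
  hitTestSource = await (session as any).requestHitTestSource({ space: viewerSpace });
});

// On tap, drop a placeholder model (a box standing in for a card's 3D asset).
window.addEventListener("click", () => {
  if (!reticle.visible) return;
  const model = new THREE.Mesh(
    new THREE.BoxGeometry(0.2, 0.2, 0.2),
    new THREE.MeshNormalMaterial()
  );
  model.position.setFromMatrixPosition(reticle.matrix);
  scene.add(model);
});

renderer.setAnimationLoop((_, frame) => {
  if (frame && hitTestSource) {
    const refSpace = renderer.xr.getReferenceSpace()!;
    const hit = frame.getHitTestResults(hitTestSource)[0];
    if (hit) {
      reticle.visible = true;
      reticle.matrix.fromArray(hit.getPose(refSpace)!.transform.matrix);
    } else {
      reticle.visible = false;
    }
  }
  renderer.render(scene, camera);
});
```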
Future work includes object recognition, which would allow a user to scan a scene and create new cards in situ, similar to my last experiment with the NFC chips. This could allow training processes to be self-documenting and replayable.
While the XR and analog training solutions were in development, I considered current robotics-teaming trends. Cobots are a form of automation that works alongside humans, allowing for more flexible integration and more precise results.
Figure 8: Fetch the cobot at a user test in the Tepper AI Makerspace
Through site visits and conversations at client workshops, I learned that many manufacturers and future clients already have improvised robotics solutions. It seems that cobots are often a first step to reaching out to the ARM Institute. As a result, I ultimately chose not to pursue this route of prototyping beyond Arduino sensors and datalogging, so as to have a greater impact on the downstream onboarding.
Figure 9: An improvised solution at Atlas Metals
Figure 10: A kiosk pretotype user testing
As a vehicle for the deliverables, I considered what the most impactful device would be for the user experience. Mobile devices are most commonly used during site visits, headsets would offer the most degrees of freedom for extended reality tasks, and desktops are available at almost all manufacturing sites, but a kiosk could be accessible to any visitor. If developed further, a kiosk could provide an additional touchpoint during client workshops, or perhaps allow for self-guided tours, supporting educational goals.
An early ecosystem proposal included five initial components: a collaborative robot (cobot), an extended reality (XR) environment, a remote sandbox connection, a card set activity, and a chatbot with fuzzy search.
Figure 11: Initial ecosystem
To help validate our assumptions, I met with two client stakeholders: the Chief Operations Officer (COO) and a Program Manager (PM). This led to a narrowed plan for Spring deliverables, with more polish and functionality in each remaining proposal.
However, outside of manufacturing-related prototypes, there is also a selection of data analysis tools for planning our site visits and client impact. These can be interacted with here and here, or below, and they show the scope of robotics career hiring and grant-eligible manufacturers. As 'side-quests', these tools allowed me to practice the geospatial features used in the summer deliverable, while also providing non-negligible technical infrastructure value to the client.
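To give a flavor of the geospatial side, a map of grant-eligible manufacturers takes only a few lines with a library like Leaflet. The data endpoint and property names below are hypothetical stand-ins for the tools linked above:

```typescript
import * as L from "leaflet";

// Center the map on the Pittsburgh region.
const map = L.map("map").setView([40.44, -79.99], 8);
L.tileLayer("https://tile.openstreetmap.org/{z}/{x}/{y}.png", {
  attribution: "&copy; OpenStreetMap contributors",
}).addTo(map);

// Load manufacturer locations and render one marker per site,
// with a popup for the name and grant eligibility.
fetch("/data/manufacturers.geojson") // hypothetical endpoint
  .then((res) => res.json())
  .then((geojson) => {
    L.geoJSON(geojson, {
      onEachFeature: (feature, layer) => {
        const { name, grantEligible } = feature.properties; // hypothetical fields
        layer.bindPopup(`${name}${grantEligible ? " (grant-eligible)" : ""}`);
      },
    }).addTo(map);
  });
```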
Figures 12 and 13: Our data analysis tools
To guide the ideation and design process, I referenced a variety of activities to help me understand the client's needs and the potential of the deliverables. These included design thinking, end-to-end analysis, and co-creation with experts. I'm particularly grateful to Pittsburgh's BSides, Code & Supply, and PyData communities, as well as the Smart Manufacturing Experience conference, for their insights.
Figures 14, 15, 16, and 17: Design, ideation, and co-creation
After prototyping, iteration, and research, I synthesized the xRM platform. Leveraging geospatial analysis, data visualization, retrieval-augmented generative AI, and extended reality, xRM brings the best emerging technologies to the difficult, complex problems that the ARM Institute consortium faces as they modernize.
Figure 18: The xRM platform