Hello!

I'm Robert

I Love UX

Nice to meet you!

I haven’t met you yet, but I can only assume you’re a fantastic person. I’ve put this portfolio together to highlight three of my more recent projects and some of what I did in each role. To keep things easy to scan, I’ve kept it short and to the point. If you have any questions, I’d be happy to answer them and walk you through my thinking and the context of my work.

I’m driven by my love for UX, my passion for XR, and my belief in technology’s power to make the world a better place. For my full work history, check out my LinkedIn.

Oct 2021 – Oct 2023

Job / Medical Extended Reality Specialist

Philips – Medical Device

• First Medical Extended Reality (MXR) Specialist hired by Philips globally
• Product Manager responsible for XR product planning and deployment
• Led design and implementation of XR product visual language and UX
• Independent project management of concurrent development projects
• Instructed global teams inside Philips on how to navigate the XR space
• Applied XR platforms to healthcare training and enablement

2021 – 2023

Project / Medical VR Training

Project Management / UX Design & Research
Project Background

The healthcare industry is confronting a major challenge. The number of surgeons providing care increases at a fixed, linear rate. The number of patients, however, increases at an exponential rate. Further, patients are living longer and requiring more and more complex care. There is no single solution to this problem; one element I work on, however, is training.

The Problem

Training surgeons is expensive and doesn’t scale easily. The current state of the art is 1:1 mentorship. It is somewhat akin to watching someone work on a car, seeing a slide show on how to do it, and then doing it yourself. Additionally, current training methods can’t recreate emergencies well. A cadaver or dummy is unlikely to have a convincing emergency event.

My Role

My work at Philips is to build an ecosystem and technology platform to support VR training. The details of my work remain covered under my NDA, but I act primarily as a Product Manager. Drawing from both my technical background and my UX background, I work to develop product roadmaps and manage development. I work with both internal and external global teams and stakeholders to build a constellation of products that push beyond the cutting edge of medical training into the future.

 

My Impact

• Led the development of VR/AR Surgical Trainers

• Created product specifications based on user research in the field

• Led internal 3D/CAD asset pipeline restructuring

• Managed relationships with OEMs and vendors in the XR space

2022

Project / AR Healthcare Concepts

UX Design / Motion Design
Project Background

You need to define where you are heading if you ever want to reach it. This work was part of a high-level effort within Philips to help define what XR in healthcare might look like within the Philips product stack.

My Role

For this project, I worked closely with our Chief Medical Officer, Global Director of Communications, and their teams. During the script ideation process, I drew on both my own experiences and ideas as well as the Philips product stack to suggest two ideas. These ideas were then tweaked and refined within the working group. Once the ideas achieved early consensus, I created mock-ups, storyboards, and scripts. With approval from the wider team, I delivered the final videos with the support of external agencies. From the first email to the final deliverable, this work was completed in just under a month. It was shown on stage at Web Summit 2022.

 

My Impact

• Created the vision for what AR/MR can be in the future of healthcare

• Created wireframes and visual style based on user/field research

• Delivered concept videos in a cooperative environment under a short timeline

• Supported global clinical and communications teams in their work

Mockups

This mockup shows an emergency room waiting area from above, as seen by a potential security camera. Over each seated person is a colored box showing their status, either red or green. Next to that box is a small UI block showing their heart rate and respiration. One person is highlighted in red, and the UI shows an expanded UI block. The expanded UI block shows red for heart rate and blood oxygenation levels, and green for respiration and skin temperature. At the top center of the screen, a red pull-down alert box is present with the text “1 Alert – Arrhythmia”.

Final

A camera angle slightly below eye level shows a group of people waiting in a medical waiting area. One person is highlighted by the augmented reality UI overlay with a high heart rate and concerning blood oxygenation levels. On the right side of the screen, a large UI panel shows more detailed information, including the patient’s vitals graphed, where each value sits in a range from ideal to too high or low, and a red alert at the top that reads “1 Alert – Arrhythmia”.
This image is a mockup of a potential heads-up display for emergency medical services personnel responding to a case where an AED was used. It shows the exterior of a large stadium-like building with a blue AR marker showing where the AED was deployed in 3D space. The augmented reality heads-up display also shows a map of the building, patient vitals taken from the AED, and a video stream taken from the AR headset of the caller.
This image is the final version of a heads-up display for emergency medical services personnel. The photo shows an outdoor environment augmented by augmented reality UI elements. These include a timeline of events starting with the 911 call, the patient’s current vitals as provided by the attached AED, and the first-person AR camera perspective of the 911 caller. Also included is a world-space AR UI element guiding the first responder to the location of the AED, giving an exact distance.
Stakeholder Input

I gathered input from both potential AED users and first responders for this concept. Their input helped refine the ideal feature sets shown and the information each user is presented with. First responder users were given an event history UI that was created with a stakeholder. AED users had their UIs slimmed down to reduce cognitive load during a stressful situation.

Mockups

This mockup is very similar to the above images. Instead of an exterior environment, this image shows the perspective of a first responder as they arrive on scene. In the image you can see a woman with AR glasses on applying hands-only CPR to an older man lying on the ground, chest up. They are in a home or apartment setting with a couch behind them and a muted but modern-looking color palette. An AED is wired to the man’s chest and lying next to him. The first responder is able to see a timeline of events as reported by 911 and the AED on the left side of their vision; on the right, they can see detailed patient vitals, including respiration and heart rate, as provided by the sensors in the AED.

Final

The final version of this emergency medical services patient heads-up display is very similar to the mockup. Again, the image shows an older man lying on the ground with his shirt open and an AED attached. Kneeling over him is a woman in a dark suit with an AR headset on who was providing care. As in the mockup, the first responder has a heads-up display showing detailed patient vitals and a now more simplified timeline of events.
Full Video

I worked with a team on this project and you can see their work and excellence on full display throughout the presentation. You can see my work from 0:00 to 4:21 and from 13:53 to 16:02.

May 2020 – Oct 2021

Job / Senior Engineer, UI/UX Developer

Stryker – Medical Device

• Solely responsible for UX of CT scanner and spinal surgery robot platform
• Identified user needs based on field/lab research and stakeholder input
• Produced high- and low-fidelity wireframes using Adobe XD
• Performed lab and field research to assess new and existing products
• Collaborated in global efforts to update design system and guidance
• Responsible for FDA 510k usability product testing and compliance

2020 – 2021

Project / Spine Surgery Robot

UX Design / User Research
Project Background

A spinal surgery robot is not a single thing; it is a system of different interwoven hardware and software elements working together. Such a system must work inside an operating room, which is itself a system of different interlinked systems, all to support the patient. You need to place a screw inside the spinal column with sub-millimeter accuracy while the patient is breathing and their spine is moving. Any gaps in understanding between the users and the system could have life-altering impacts. If an emergency does occur, humans and machines must work flawlessly in concert to provide time-sensitive care. There isn’t a lot of room to be wrong.

My Role

They asked me during my interview if I had a strong stomach; I learned later why that was important for a UX designer. The first thing I did when I joined the team was go into the field and learn how spine surgery was done. I listened to users, from surgeons to nurses, so that I could model their needs and interactions through the course of a single case and between them.

I was solely responsible for the usability of the system and in many cases how the product was to work. As a bridge between users, engineering, marketing, and other stakeholders I helped the team to define a great product.

My Impact

• Created the entire UX flow from scratch including the entire UI

• Supported development teams implementing the UI I had created

• Cooperated with global UX team to align design across products and teams

• Led usability validation labs and conducted field research

• Brokered alignment between diverse sets of stakeholders on product vision

High Fidelity Mockups
This image shows the UI from the Spinal Surgery Robot. It uses dark, high-contrast colors and large, touch-friendly buttons. This specific screen guides a user through connecting cables to specific ports.
This image shows the UI from the Spinal Surgery Robot. It uses dark, high-contrast colors and large, touch-friendly buttons. This specific screen guides a user through calibrating a tool.
Who I worked with

On the Spinal Surgery Robot project, I worked closely with upstream and downstream marketing, clinical representatives, compliance and regulatory teams, software and hardware engineering, project leadership, and end users to balance the needs of all stakeholders.

High Fidelity Mockups
This image shows the UI from the Spinal Surgery Robot. It uses dark, high-contrast colors and large, touch-friendly buttons. This specific screen shows the success of a robot arm setup task.
This image shows the UI from the Spinal Surgery Robot. It uses dark, high-contrast colors and large, touch-friendly buttons. This specific screen shows the process of planning where each screw in the surgery will go.
This image shows the UI from the Spinal Surgery Robot. It uses dark, high-contrast colors and large, touch-friendly buttons. This specific screen shows the process of segmenting and labeling the spine from a CT scan.
This image shows the UI from the Spinal Surgery Robot. It uses dark, high-contrast colors and large, touch-friendly buttons. This specific screen shows the dashboard of the application, displaying the state of the case, plan, and tracked objects.
Key Learnings
01 Your desk isn’t the real use environment

Although not the most exciting detail, the choice of each color was driven by research. The product was designed for use in a sterile operating room. Screens are often draped in dim rooms with harsh spotlights, creating strong glare and washed-out colors. You would never realize the challenges your users face if you just sat at your desk. Field research is key to any successful UX design.

02 Surgery is a team sport

Surgery is not performed alone; it is performed by a team of experts performing a dizzying array of complex tasks back to back. To be successful, you need to design for the whole team and for the whole game, not just a single player. This means integrating a diverse array of perspectives and use conditions.

03 You aren’t always your target user

The distance between your team and your users can be a key determining factor in how decision-making breaks down. No one on the design or engineering teams was a surgeon or had that experience; as a result, it was counterproductive to make decisions that were not guided by data or by direct, broadly supported user feedback.

Fun Fact

Modern surgical navigation systems use the same IR camera technology employed in motion capture systems used in entertainment.

2020 – 2021

Project / CT Scanner

UX Design / User Research
Project Background

This CT scanner product was a mobile scanner for use inside the operating room. Its unique product feature was its ability to generate full 3D CT images inside a normal OR instead of a more specialized room. The product had been on the market for a number of years and was in use globally. The scanner was used for COVID-19 cases, spinal surgeries, and a variety of other adult and pediatric cases. It was an FDA 510(k)-cleared product that emitted radiation; any misunderstanding between the user and the system could be disastrous.

My Role

Prior to my joining the team there was no UX designer. No one was even sure who had designed the first UI, only that it no longer made much sense. My task was to re-examine the system and help polish it to a mirror finish. I was broadly in charge of usability, which wasn’t limited to what was on the 800 by 600 px screen. I took a holistic view to bring order to chaos. I had to wear a heavy lead vest and radiation badges for user testing, but I know a product I worked on is out there helping people.

My Impact

• Redesigned entire UI flow and updated visual design

• Significantly reduced time to complete most UI tasks

• Conducted in-depth lab and field research into product

• Worked with broad team to help define product roadmap

High Fidelity Mockups
This image shows the UI from the CT Scanner project. The screens use high-contrast colors and large, touch-friendly icons to support the scanner’s 800×600 touchscreen interface. This specific mockup shows the main menu of the scanner.
This image shows the UI from the CT Scanner project. The screens use high-contrast colors and large, touch-friendly icons to support the scanner’s 800×600 touchscreen interface. This specific mockup shows the screen for entering the patient’s basic information and the weight needed to set the scan dose.
Who I worked with

The input of a diverse array of stakeholders was key to ensuring any design met the needs of all users. This was a product that throughout its lifecycle was touched by a lot of different user groups. I worked with internal stakeholders such as software & hardware engineering, field maintenance, project management, marketing, and sales. I also worked closely with all target users including surgeons, medical physicists, and rad techs.

High Fidelity Mockups
This image shows the UI from the CT Scanner project. The screens use high-contrast colors and large, touch-friendly icons to support the scanner’s 800×600 touchscreen interface. This specific mockup shows the screen for selecting how the patient is placed, the area on the patient that will be scanned, and how the scanner will capture the images.
This image shows the UI from the CT Scanner project. The screens use high-contrast colors and large, touch-friendly icons to support the scanner’s 800×600 touchscreen interface. This specific mockup shows the settings for a low-dose scout scan performed before the full scan to ensure correct settings.
This image shows the UI from the CT Scanner project. The screens use high-contrast colors and large, touch-friendly icons to support the scanner’s 800×600 touchscreen interface. This specific mockup shows the screen for selecting the quality and type of scan.
This image shows the UI from the CT Scanner project. The screens use high-contrast colors and large, touch-friendly icons to support the scanner’s 800×600 touchscreen interface. This specific mockup shows the image after a scan, to be reviewed for quality before it is sent to a PACS system.
Key Learnings
01 Test with your real users

Field survey and maintenance data indicated an odd pattern of damage to the system during transport. Lab and field research I conducted traced this to how the system was designed to be moved and the body type it assumed. The internal hardware engineering team averaged 6’1” in height and was majority male, while field data indicated a predominantly female nurse target user with highly variable heights and upper-body strength.

02 If it isn’t attached it will get lost

During lab testing of emergency procedures with real users, I uncovered skill gaps. These traced back to the design expectation that users would always have access to their manuals. Survey data indicated that 0% of current users had easy access to their manuals; only a small minority even knew where theirs was. When designing for the worst-case scenario, don’t count on things going your way.

03 Idiot Proofing

One of the hard-won lessons the FDA tries to impart is that when working to avoid an adverse event, the first thing you should do is build a system that prevents it from occurring. The last resort, if everything else fails, is telling users what not to do in a manual. My experience with this project taught me the value of this approach. The example I would use with engineers is box cutters: modern ceramic box cutters never lose their edge but also aren’t sharp enough to cut skin. To make the best product possible, you need to be innovative in your design.

Fun Fact

The system used CUDA acceleration for image reconstruction, and systems were equipped with either an NVIDIA GeForce GTX 1080 or RTX 2080 Ti GPU paired with an x86 Intel CPU. It was possible to run Doom on the CT scanner.

I do things outside work too!

I work to give back

I believe XR will change the world one day, but I also think it isn’t quite there yet. We haven’t reached a home dial-up moment or the move from mainframes to personal computers quite yet. That’s why I love showing XR to new people and giving them a chance to interact with something that will define the future. Most recently, I had the chance to speak to a small class of community college students about what the future holds and give them some hands-on time. At the start they were either asleep or on their phones; by the end they were bursting with questions and ideas.

This image shows Robert adjusting the HoloLens 2 headset on a student’s head as another student tries VR behind them.
This image shows Robert guiding a user through the use of the HoloLens 2 headset as they stand shoulder to shoulder.
and I keep active with photography

When I get the right subject, I love shooting macro photos of things that give off amazing color; when the right one comes up on eBay, a silicon wafer can produce some incredible colors. When I’m not shooting macro, I go for panoramic photography. I enjoy the challenge and the result of trying to capture the entirety of a big scene like a mountaintop.

This image shows the structure of a silicon wafer very close up. Each chip reflects bright colored light, ranging from yellow-orange to deep purple and many colors in between.
This image shows what appears to be an iceberg inside the water in Boston harbor. The iceberg is actually a public art installation and is made of foam. The photo features strong contrast and a muted color palette with the water in front and tall city buildings behind almost entirely blocking the sky.
This macro photo shows the texture of the epoxy imitating sea ice up close. Waves of light white and blue material can be seen reflecting the light in eye-catching ways. It is unclear exactly what a viewer is looking at without prior context.
This image is a very large panorama taken from atop a metal fire tower atop a mountain. You can see the slope of the mountain meet the ground and then reach the horizon all covered in thick dark green trees. Meeting the earth is a vast blue sky that is dark blue before becoming lighter blue in the distance. The sky is filled with but not covered by puffy white clouds casting shadows back on the earth. The photo is meant to cover the entirety of what you could see with your eyes from left to right and convey the vastness of the landscape.

and that’s it, for now

Let’s Chat

mail@robert-gilliam.com

Made with Love!

Robert Gilliam 2023