PEACH Reality

As part of my second year of studies at University College London, I've enjoyed the opportunity to work on an exciting proof of concept under Project PEACH, a collaboration between the UCL Computer Science department and the University College London Hospitals NHS Trust (UCLH). The undergraduate degree programme places a heavy focus on giving students a form of industry experience during the first two years, so we've been able to work on applied software engineering projects brought forward by industry clients, solving or exploring a problem space through the lens of the software engineering discipline.

Project PEACH encompasses a range of sub-projects, all targeting the use and modernisation of technology, with an overview available for reading here. The sub-project we got to work on is PEACH Reality, a mixed reality platform for exploring medical imaging data. Through the course of our experimentation and design, we have aimed to deliver a system that integrates seamlessly into the workflow of a practitioner who needs the alternative perspectives afforded by mixed reality. To describe the Reality project and what we hope to achieve by its end, our team published "First Steps: PEACH Reality".

PEACH Reality: Exploding the 3D model (shown in the article)

My role throughout the implementation phase of our proof of concept will be the design and development of the API, including its implementation of the holographic patient case outlined in the published article. I expect the project to be interesting work, as I have not developed a multi-client system before where one of the clients is a standalone mixed reality computing device (in this instance, the Microsoft HoloLens). I did manage to have some fun prototyping the REST client within our HoloLens imaging application, though.
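As a rough illustration of what a "holographic patient case" might look like as an API resource, here is a minimal Python sketch. The field names (`case_id`, `assets`, `uri`, and so on) are hypothetical placeholders for illustration only, not the actual PEACH Reality schema:

```python
import json
from dataclasses import dataclass, field, asdict
from typing import List

# Hypothetical shape of one holographic asset attached to a case,
# e.g. a 3D mesh derived from imaging data. Field names are
# illustrative assumptions, not the real PEACH Reality model.
@dataclass
class HologramAsset:
    asset_id: str
    kind: str   # e.g. "mesh" or "imaging-series" (illustrative)
    uri: str    # where a client would fetch the binary payload

# Hypothetical top-level patient case resource that a REST client,
# such as the one prototyped in the HoloLens application, could
# retrieve and deserialise.
@dataclass
class PatientCase:
    case_id: str
    title: str
    assets: List[HologramAsset] = field(default_factory=list)

    def to_json(self) -> str:
        # JSON is a natural wire format for a multi-client REST API.
        return json.dumps(asdict(self))

case = PatientCase("case-001", "Demo case",
                   [HologramAsset("a1", "mesh", "/assets/a1")])
payload = json.loads(case.to_json())
```

Modelling the case as a small document with links out to heavier binary assets keeps the JSON payload light for a head-mounted client, which can then fetch only the holograms it needs.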