We are seeking an interventional image computing researcher to design and translate the next generation of real-time AI-assisted hyperspectral imaging systems for surgical guidance. The postholder, based within the Department of Surgical & Interventional Engineering at King’s College London, will play a key role in a collaborative project with King’s College Hospital and Hypervision Surgical, a recently founded King’s spin-out company. A clinical neurosurgery study has been set up to underpin this collaboration. The successful candidate will work on the resulting neurosurgical data as well as retrospective data. They will also have the opportunity to provide insight on how to best acquire prospective data.
We are seeking an interventional image computing researcher to design and translate the next generation of AI-assisted hyperspectral imaging systems for surgical guidance using quantitative fluorescence. The postholder, based within the Department of Surgical & Interventional Engineering at King’s College London, will play a key role in a collaborative project with King’s College Hospital and work closely with the project’s industrial collaborator Hypervision Surgical, a recently founded King’s spin-out company. A clinical neurosurgery study has been set up to underpin this collaboration. The successful candidate will work on the resulting neurosurgical data as well as controlled phantom data. They will also have the opportunity to provide insight on how to best acquire prospective data.
We are seeking a highly motivated individual to join us and work on FAROS, a European research project dedicated to advancing Functionally Accurate RObotic Surgery (https://h2020faros.eu), in collaboration with KU Leuven, Sorbonne University, Balgrist Hospital and SpineGuard.
Optimal outcomes in oncology surgery are hindered by the difficulty of differentiating between tumour and surrounding tissues during surgery. Real-time hyperspectral imaging (HSI) provides rich, high-dimensional intraoperative information that has the potential to significantly improve tissue characterisation and thus benefit patient outcomes. Yet taking full advantage of HSI data in clinical indications performed under binocular guidance (e.g. microsurgery and robotic surgery) poses several methodological challenges which this project aims to address. Real-time HSI sensors are limited in the spatial resolution they can capture, which further limits their usefulness in multi-view capture settings. In this project, we will take advantage of a stereo-vision setup combining a high-resolution RGB viewpoint with an HSI viewpoint. The student will develop bespoke learning-based computational approaches to reconstruct high-quality 3D scenes that combine the intuitiveness of RGB guidance with the rich semantic information extracted from HSI.
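To illustrate the resolution gap the project addresses, the sketch below shows one naive way to fuse a low-resolution HSI cube with a co-registered high-resolution RGB view: upsample the cube, then inject high-frequency spatial detail from the RGB luminance into every band (a crude pansharpening-style heuristic). This is purely illustrative and not the project's method; all function names, shapes, and the `alpha` weight are hypothetical.

```python
# Illustrative sketch only (NOT the project's learning-based approach):
# fuse a low-resolution HSI cube with a high-resolution RGB view.
import numpy as np

def upsample_hsi(hsi, scale):
    """Nearest-neighbour upsample of an (H, W, B) HSI cube by `scale`."""
    return np.repeat(np.repeat(hsi, scale, axis=0), scale, axis=1)

def guided_fusion(hsi, rgb, scale, alpha=0.3):
    """Add RGB-derived spatial detail to each upsampled HSI band."""
    up = upsample_hsi(hsi, scale).astype(float)
    lum = rgb.mean(axis=2)  # RGB luminance at full resolution
    # Low-pass the luminance down to the HSI grid (block means) ...
    low = lum.reshape(hsi.shape[0], scale, hsi.shape[1], scale).mean(axis=(1, 3))
    # ... and take the residual as the high-frequency detail layer
    detail = lum - upsample_hsi(low[..., None], scale)[..., 0]
    return up + alpha * detail[..., None]  # broadcast detail over all bands

# Toy data: a 4x4 HSI cube with 16 bands, an 8x8 RGB image (scale factor 2)
hsi = np.random.rand(4, 4, 16)
rgb = np.random.rand(8, 8, 3)
fused = guided_fusion(hsi, rgb, scale=2)
print(fused.shape)  # (8, 8, 16)
```

Because the detail layer has zero mean by construction, the fusion preserves the per-band averages of the upsampled cube while sharpening edges. A learning-based reconstruction would replace this fixed heuristic with a trained model and additionally recover 3D geometry from the stereo pair.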
Multi-task learning is common in deep learning, and there is clear evidence that jointly learning correlated tasks can improve performance over learning each task individually. Nevertheless, in practice many tasks are still processed independently. The reasons are manifold:
Join us at the IEEE International Ultrasonics Symposium where CAI4CAI members will present their work.
Christian Baker will be presenting on “Real-Time Ultrasonic Tracking of an Intraoperative Needle Tip with Integrated Fibre-optic Hydrophone” as part of the Tissue Characterization & Real Time Imaging (AM) poster session.