Man vs Machine: A Doctor's Defence of Medical AI

And the results are in… Med-Gemini, Google’s medical AI model, has shown astonishing ability across a variety of medical tasks. Its recently published open paper shows the AI can achieve 91% accuracy on the MedQA (USMLE) benchmark.

And perhaps more impressive is Med-Gemini’s utilisation of medical knowledge. Google’s medical AI has pushed the boundaries of chest X-ray interpretation, correctly identifying normal X-rays and, in certain datasets, 65% of abnormal ones.

Given time, medical AI has the potential to become more accurate and better at diagnosis than the most accomplished medical specialist.

What does this mean for us mere mortals? The human doctors on the shop floor trying to bully our brains into absorbing the ever-changing guidelines and technologies? Are we now inferior to a machine?

Possibly.

Is this a good thing?

Possibly.

Here goes: I’m pinning my colours to the mast and voicing my support for medical AI. Here are my reasons why I’m optimistic about the rise of the medical supercomputer:

Not all doctors are great

One of the most quoted arguments against medical AI is that it can’t have the same personal, empathetic touch as a human doctor. But having been part of countless medical teams over the last 20 years, I have seen my fair share of fallible clinicians. Clear thinking and deportment suffer under stress, and the extreme working conditions of today’s healthcare systems can erode good medical care.

If a medical AI makes the safest decisions, I would take it over a well-meaning but inaccurate human doctor, let alone over an obviously incompetent one.

AI as an adjunct

I see AI as an evolution in the tools doctors already have. I don’t pretend to know the whole of Kumar and Clark’s ‘Clinical Medicine.’ I am, after all, not a robot. But I do rely on reference texts and guidelines when I need them.

In the same way that I look up the dosing of medications in the BNF, I feel AI could act as a cross-check, adding an extra safety layer to my practice. Using Med-Gemini’s example, if I think a chest X-ray looks normal but the AI says differently, I will probably take a closer look at the image, especially if the AI’s accuracy matches a radiologist’s more often than not.

We haven’t cursed the release of many transformative medical technologies, even when they usurped previous gold standards of treatment or diagnosis. I am incredibly glad an MRI can better determine the nature of a patient’s lump, even though that makes a machine a better diagnostician than me. So too, I believe, AI will become another tool in my medical arsenal, something to be embraced rather than resented.

Redirect attention

It will take some time for AI to become the true deus ex machina that replaces the human in the medical interaction. However, the hope is that with AI automating the administrative burden, a doctor will have more time to concentrate on the actual patient encounter. With no more endless clicking to order a test, hours of doctors’ time can be spent on more important things, regaining a sense of balance and health in the working day.

‘Worst’ Case Scenario

Say the feared thing happens and doctors become obsolete, what then? Whilst I don’t think that is likely, these highly talented minds could be directed to other things - maybe focusing on preventative healthcare or maintaining mental wellbeing and flourishing, rather than treating disease. Maybe we’ll all become bona fide creatives; after all, medicine is both an art and a science. I have no doubt we doctors will find something good to do with our time if we’re no longer needed.

The Digital Healthcare Show: Hot and Not

I'm becoming very familiar with ExCeL London. Last week’s Digital Healthcare Show was my second visit to the East London conference centre, following February’s Festival of Genomics. Armed with the basics of how to change the channel on the wireless headsets, I had more time to appreciate the latest digital health innovations making a splash on the scene.

So, as with Rewired, here are the latest trends from one of the biggest digital health conferences in the UK this year.

Hot:

  1. The End-User

    Digital health is finally taking notice of a vital figure in its pathways: the end-user. James Freed’s niftily titled ‘Why your digital transformation will fail’ showed how vital it is to keep the end-user in mind and onboard them appropriately; otherwise, as his examples demonstrated, your initiative will fail.

  2. Digital Literacy

    As we’ve already seen, digital health is transforming nearly every aspect of healthcare. Just as Lloyd George envelopes are a thing of the past, the healthcare landscape will transform over the next few decades, and thankfully, leaders recognise the need to bring patients and staff with them into the digital era. The panel discussion chaired by Caroline Stanger highlighted how the digital transformation is conducted at both a local and national level, and Clare Thomson of Imperial College Health Partners gave real world examples of this.

  3. Virtual Reality

    Before you bin your Meta Quest headsets (not that you would at that price), VR may be the next big thing on the medtech scene. Already successfully employed in training simulators, e.g. Bodyswaps, VR may also have therapeutic use in areas like mental health. An interesting panel discussion from Aileen Jackson showed the potential for VR as both a teaching tool and a therapy in and of itself. Incredibly exciting is that creative studios like Anagram are now making ADHD simulation experiences, helping people understand what the experience of a mental health condition is like.

Not:

  1. Burnout

    In fact, medtech is targeting healthcare worker burnout through both systems and technology. Ambient voice technology like that of O.S.L.E.R and Nabla could be a major factor in reducing the administrative burden of a medical consultation. Another potential lifesaver is robotic process automation (RPA): automated emailing of public health information, as used by the Berkshire NHS Health Visiting Team, has been beneficial for both patients and the administrative team.

  2. Ignoring the experts

    Clinician input is vital, as evidenced by the best digital products. Pathways developed in collaboration with clinicians, e.g. Ramai Santhiripala’s streamlined surgical care, stand the best chance of success.

  3. Sitting still

    Digital innovation continues apace and AI still features heavily in the discourse. Certainly at a Digital Healthcare Show I expected momentum, but the pace, and the belief that tech solutions can solve problems, was greater than I expected. We need more of this conviction, alongside proven success and greater clinician and patient involvement, to really have impact through digital health.

Women’s Health Reimagined - Talking Tech's Panel Discussion

With a potential market of 50% of the world’s population, the #femtech industry has huge potential. Women’s health, which for so long has been a Cinderella subject in the broader medical field, has in recent years come into the spotlight. It makes sense that tech would follow suit.

Tonight’s panel discussion hosted by #SODA Socials drew interesting insights from figures in leading women’s health companies like Flo and Elvie. Tackling big questions like data security and challenges in securing funding, the panelists shared their experience in this important field.

Panel discussion:

Femtech is taking off and developments in the area are long overdue. Pablo Solano, Product Design Manager at Flo, explained how the period-tracking app has branched out from a simple cycle calendar into an emotional communication tool, allowing women to communicate how they feel to their partner via the app.

The latter development was an example of user-driven feedback, which the team took to heart in the app’s development and is now a successful and profitable feature. Whilst I’m not sure whether digital notifications should be the default method of communication in a relationship, if it leads to an open discussion, I suppose that’s a good thing.

Smriti Garga, Senior Product Manager at Elvie, led questions about the particular challenges femtech products face. Using the examples from Elvie’s early days, where founder Tania Boler faced VC pushback about a ‘niche’ product (that pertains to almost 4 billion people), Garga explained how even today the word ‘vagina’ can put investors off. But their products, including their lauded breast pump and digitised pelvic floor trainer, help women manage important features of their health and as the taboo lifts, the industry is likely to go from strength to strength.

The last panelist, Rachel O’Donnell, raised an optimistic vision of the future, championing coding and open discussion of women’s health. Giving useful suggestions like how to make a successful product (look at competitors and work out a niche) and survive a hackathon, O’Donnell spoke to the more tech-literate members of the audience.

The future:

As ever, AI seems to generate a lot of the buzz for the future, with belief that symptom-checking, cycle-analysing chatbots are likely to be key in the field. Yet again tech experts feel the future lies in having an AI intermediary between patient and physician. They may well be right.

Personally, I am looking forward to smart tech that can actually help in diagnosis or better yet, symptom control for many women’s health issues. I think products that have potentially huge impact include self-sampling for cervical cancer (for example Teal) and menopausal symptom control e.g. Grace’s hot flush-combatting bracelet.

For too long women’s health has been a neglected and shamed field. I’m glad I live in an era that sees it given the attention, technological and otherwise, that it deserves.

The Cutting Edge… Imperial’s Diagnostics Showcase

A free lunch, South Kensington and the newest medical diagnostic technology - what’s not to like? Yesterday’s Medtech Links Event from Imperial College London was an inspiring showcase of the medtech being developed by scientists from the prestigious institute.

The phrase ‘it’s impossible until it’s done’ comes to mind when considering some of the devices, but this is exactly what medicine needs - experts who think big and then make it happen.

The event, held in the impressive Royal School of Mines, started with sandwiches and posters of the work that would be discussed in the subsequent sessions. As a lone GP (signified as an ‘external’ audience member by a red dot on my badge) I could see the majority of attendees were from Imperial (with a blue dot on badge). But with many of the posters featuring point of care diagnostics targeted for rollout in primary care (yay!), I wished there were more coalface clinicians at the event to take inspiration and get involved in the innovation.

Unfortunately, I could only attend the first hour of the showcase triptych, but the four talks from that session were eye-opening. Sylvain Ladame opened with work from his lab developing direct-to-skin patches that detect cancerous skin lesions using microRNAs. It’s incredible to think that one day a GP may be able to apply a squidgy plaster next to a mole and, a day later, have results back from the lab indicating whether the lesion is likely to be cancerous. In combination with the AI tools from Skin Analytics discussed in my last post, skin cancer diagnosis will be revolutionised over the next few years.

Delving into smaller and smaller entities, Leah Frenette’s talk about nanozyme technology left the mind boggling. Tiny particles that act like enzymes (ahem, nanozymes) are being used to amplify weak signals, for example on the strip of a lateral flow test. Although LFTs for COVID and pregnancy tests seem to work well at the moment, their potential use with new biomarkers, e.g. cancer proteins, could transform how we detect and diagnose disease.

A company that seems to be well on the way in this process is ProtonDx, whose CEO, Prof Pantelis Georgiou, gave a run-through of their diagnostic technologies, which are already on the market. Using X-Men-like features (magnetic beads!) to amplify nucleic acids from a sample, ProtonDx’s shoebox-sized instrument can perform pathogen screens in less than 30 minutes. For a former hospital doctor, who would often wait days for a respiratory virus screen to come back from the lab, the thought of being able to obtain a diagnosis in such a short time frame and in any context is incredible. And adding further awe to the mix, Prof Georgiou showed ProtonDx’s technology being used in Africa to detect malaria. The potential of such technology is huge.

Rounding off the four talks was Session 1’s Chair, Professor Hanna, explaining the use of breath testing to diagnose upper GI cancers. The surgeon explained how different gastrointestinal tract cancers gave different profiles of volatile organic compounds (VOC), which could be detected from something as simple as a patient breathing into a tube. Although the concept exists already for detecting H. pylori in the gut, the thought of cancer being detected this way is concept-breaking. The technology is now set for validation testing and is being trialled in multiple NHS Trusts across the country.

The underlying thread linking all the innovations discussed was the use of a simple action (collecting a breath, placing a patch or taking a drop of bodily fluid) in an incredibly complex scientific process. For the user, however, the diagnostic result seems to be obtained simply. If I hadn’t understood the science behind it, I would think it worked through magic.

But this isn’t magic, this is science and the best kind: it is diligent, sustained application of knowledge to make medicine appear simple. Ground-breaking and exciting, I cannot wait to hear more.

Dermatology: The Next Frontier

I’m not a fan of dermatology. Not so much because I find skin things weird, but rather I often struggle to diagnose skin conditions. Unfortunately, given that 14% of presentations to a GP involve a dermatological element (1), a significant portion of my clinic is spent puzzling over a mole or rash.

Here to save me are AI-driven medical diagnostics, which have transformed dermatology in recent months. With some platforms boasting 97% accuracy (2) in diagnosing skin cancers, an AI interface between primary care and specialist services could soon be the default in skin pathways.

Major players include Skin Analytics, who lead the field with their AI-driven software, DERM, the only UKCA-approved Class IIa AI as a Medical Device for skin cancer. A skill that used to be the sole remit of consultants with many years’ experience and learning can now be performed with astonishing accuracy by a computer.

And the software is democratising in its reach. Platforms like Metaoptima’s ‘Molescope’ offer patients the chance to take high quality photos of their own concerning lesions, to be sent on for further assessment. This is miles ahead of the grainy phone images sent to me at the moment.

With AI taking off, advances are being made at an astonishing rate, and these developments are present in almost every medical speciality. From RetinAI’s data-processing systems, which give AI-driven analysis of retinal OCT images to help diagnose and monitor retinal disease, to the nine NICE-approved AI-driven programs (3) that help contour radiotherapy treatment plans, AI diagnostics are here to stay.

Such advances present philosophical questions about the place of the human physician in all of this. Personally, I have no qualms coming second place to AI when it comes to diagnosing a skin lesion. If it means that diagnostics are safer, quicker and more accurate, I’d go with that system any day. After all, the guiding principle of healthcare is to do good by a patient and if that good is done better, more accurately and safely by a computer, so be it.

In an oversubscribed healthcare system, safe and verified software that takes the burden of diagnosis off a clinician can only be a good thing.

  1. https://bjgp.org/content/70/699/e723

  2. https://skin-analytics.com/performance/

  3. https://www.nice.org.uk/guidance/hte11/chapter/1-Recommendations

Medtech goes to the Movies

Is the future written? Einstein himself wrote:

‘For those of us who believe in physics, the distinction between past, present and future is only a stubbornly persistent illusion.’

If Einstein says so, it’s probable that the future is not as remote and unknowable as we currently think it is.

Even if you don’t go as far as to believe the future has already happened, there is an innate human desire to predict the future. Be it a sneaky glance at the horoscope or a ‘feeling’ about the lottery numbers, many of us would like to know what’s coming up.

But whilst science gets on with the number crunching to make that a reality, as they already have done with weather forecasting and climate change simulations, I turn to less solidly researched fare. I believe the future lies in film.

Just as Douglas Adams’ ‘Hitchhiker’s Guide to the Galaxy’ nailed the concept of the smartphone, here’s my list of films that may hold the answer to what’s coming up on the medtech scene:

Gattaca

Story: Genome testing at birth predicts a person’s risk of disease and thus assigns them to a particular stratum in society. Based on his genetic profile, the main character, Vincent Freeman, is predicted to die aged 30, which in this eugenics-focused society relegates him to the ‘in-valid’ class. Limited to menial jobs in this caste-based society, he tries to fool the authorities by using someone else’s DNA in genetic verification tests to achieve his mission of becoming an astronaut.

Reality: We’re not too far off the premise of Gattaca. With initiatives like 100 000 Genomes and the UK Biobank forming repositories of sequenced genomes and associated health profiles, it’s not far-fetched to think that every human may have their genome profiled at birth, to predict their risk of various diseases. It raises interesting ethical questions as to the line between advantageous personalised medicine versus Gattaca-like bias and anxiety about health risks.

How close to the story is reality? 70% of the way there (based on already-widespread genome sequencing; the societal implications are not in play… yet).

Never Let Me Go

Story: Children grow up in an isolated boarding school, where they find out they are genetic clones of other humans and will eventually be used as organ donors. There’s also a complex love triangle.

Reality: Again, the power and potential peril of genetic science is explored. Although we may balk at the idea that a person would ever be used in such a way, the presence of saviour siblings and transplant tourism shows parallels.

How close to the story is reality? 40% (based on the fact that we don’t allow human cloning. The science may be prohibited but unfortunately the ethical issues are already in play).

See also: My Sister’s Keeper

The Matrix

Story: Reality is not all it seems. The iconic Matrix movie draws back the veil to reveal that Neo’s perceived daily life (very similar to yours and mine) is actually a computer simulation called the Matrix, and he actually lives in a bleak robot-ruled world where his bioelectric energy is siphoned off to power the computers.

Reality: With big hitters like Meta bringing out their VR goggles, the possibility of living a different life and having a different experience becomes more of a possibility. What does this mean in healthtech terms? Anything and everything. The average human being may now be able to experience what it is to run like Usain Bolt amongst other things. The possibilities are huge, empowering and possibly dangerous.

How close to the story is reality? 60% (or perhaps 100% if we are actually in the Matrix)?! With Meta’s goggles, VR is already out there. The question is, how far will it go and will it be used to bring new physical and health-related experiences?

See also: Ready Player One

Chuck

Story: A computer store employee watches a video of encoded snapshot images, which uploads a database of US government intelligence into his brain. Subsequently, when seeing certain images, he remembers certain intelligence secrets and, with later versions of the program, automatically learns languages and martial arts. Japes ensue.

Reality: I don’t know if we’re there yet but the aspiration is high. The thought of uploading information easily into the human brain is a huge area of interest especially to big players like Neuralink, who are trying to do this with hardware. I’ve always thought there has to be an easier way to remember knowledge rather than rote memorising and revision. Maybe this is the future?

How close to the story is reality? 65% (based on the fact that HRL Laboratories have already improved human cognition speeds using a ‘scalp cap’, plus Neuralink’s recent brain-computer interface successes. Subliminal messaging and brainwashing have already been tried and tested, but the success rates are uncertain, as the CIA just doesn’t release that kind of information.)

See also: The Manchurian Candidate, Zoolander

Ex Machina

Story: A software programmer is invited to a billionaire’s private estate to see whether the billionaire’s pet project, an AI-powered humanoid robot, can pass the Turing test. Lots of complex relationship dynamics and trickery.

Reality: The thought of building helper robots has long existed and has been posited as a solution for human labour. Imagine having a convincingly intelligent robot-carer to keep everyone happy, healthy and motivated. However, AI poses a huge ethical unknown, especially if it becomes sentient à la Ex Machina. Thought-provoking and highly relevant in today’s brave new world.

How close to the story is reality? 70% (with the recent advances in AI, machine sentience seems ever closer. And with incredible robotics being designed by the likes of Boston Dynamics, the plot of Ex Machina seems ever more like reality.)

See also: Blade Runner, Humans, I, Robot, Elysium

Red Dwarf

Story: A spaceship worker is put into stasis and wakes up 3 million years later to find everyone on his ship has died and he is alone, with only a hologram of his annoying roommate, a super-intelligent computer and a bipedal cat for company.

Reality: This is a fun one for the future. With concepts like bodily stasis and the technology to preserve consciousness, Red Dwarf takes incredible ideas and presents them with the most humorous of touches. After so much dystopia, here is health technology presented in a fun, unthreatening form.

How close to the story is reality? 35% (I don’t know where we are on cryostasis and interplanetary travel but I’d like to see it when it comes).

See also: The Hitchhiker’s Guide to the Galaxy

The (Virtual) F1 in the Corner

Occasionally a product turns up that changes the way you think about work. So much of my time as a doctor has been spent writing patient notes, that I took it for granted. From my days as a newly qualified Foundation Year 1 (FY1) doctor, I would scribe for my consultants as they did hospital ward rounds, and now as a GP, many of my extra hours of overtime focus on documenting the numerous patients I’ve seen in the day. Honestly, I didn’t think there could be any other way.

But I’m pretty sure that’s all going to change in the next few years.

Yesterday’s launch of TORTUS’ latest medical AI co-pilot software, O.S.L.E.R, showed me that pretty soon each clinician could have their own invisible scribing FY1 right there on their computer.

So how does it work?

Exactly like those consultant ward rounds, the software ‘listens’ to the consultation (i.e. records it) and then transcribes it word for word. Using AI wizardry, the program then analyses the transcription and writes a potted summary in a template you can adjust to suit you.

The clinician reads through the summary, makes any adjustments and then hey presto! Your consultation is interpreted and documented for you with very little writing effort on your part.
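The record → transcribe → summarise loop can be sketched in miniature. This is purely illustrative: every function name here is invented, and where a real scribe like TORTUS uses speech recognition and large language models, this toy version takes text ‘utterances’ and sorts them into a SOAP-style note by keyword, just so the sketch stays runnable.

```python
# Toy sketch of an ambient-scribe pipeline: record -> transcribe -> summarise.
# All names are illustrative; real products expose their own APIs and use
# speech/language models rather than keyword rules.

def transcribe(audio_lines):
    """Stand-in for a speech-to-text engine: here the 'audio' is already text."""
    return [line.strip() for line in audio_lines if line.strip()]

def summarise(transcript):
    """Toy summariser that buckets utterances into a SOAP-style template.
    A real scribe would use an LLM; keyword matching keeps this runnable."""
    note = {"Subjective": [], "Objective": [], "Assessment": [], "Plan": []}
    for line in transcript:
        lowered = line.lower()
        if any(k in lowered for k in ("i feel", "pain", "since")):
            note["Subjective"].append(line)
        elif any(k in lowered for k in ("bp", "exam", "temperature")):
            note["Objective"].append(line)
        elif "likely" in lowered or "diagnosis" in lowered:
            note["Assessment"].append(line)
        else:
            note["Plan"].append(line)
    return note

def draft_note(audio_lines):
    """Produce the draft the clinician reviews and edits before filing."""
    note = summarise(transcribe(audio_lines))
    return "\n".join(
        f"{section}: {'; '.join(items)}" for section, items in note.items() if items
    )

consultation = [
    "I feel a sharp pain in my chest since Tuesday",
    "Exam: chest clear, BP 120/80",
    "Likely musculoskeletal diagnosis",
    "Start ibuprofen and review in one week",
]
print(draft_note(consultation))
```

The key design point, mirrored from the product described above, is that the pipeline only ever produces a draft: the clinician stays the final author of the note.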

Impressions

I like it. I really really do. For me, it’s like having a medical student doing the writing for you, taking one more task off your hands. Sometimes in the crush of a GP clinic, I have to complete my notes at the end of the session, and patient details blur or are overlooked. This software would take some of that cerebral load off my hands by:

a) having a complete record of the consultation for my reference

b) taking away the effort of writing up notes and minimising typos

c) to some degree, organising my thought processes into the medical notes format (rather like the best kind of FY1).

Limitations

  • There’s something off-putting about having a full audio record and transcription of a patient-doctor consultation. It seems a bit Big Brother to me, even though I know all GP telephone consultations are already recorded.

  • Clinicians may have to adapt their consultation style, speaking out results and thought processes, in order for it to be recorded and then transcribed by the AI. This could throw off the natural flow and interaction between patient and doctor.

  • For the atypical patient consultation, e.g. patients who may not ‘stick to a script’ or those using an interpreter, this may pose a level of complexity the AI cannot currently deal with. Luckily, I still know how to type and can write these up old-school.

So all in all, it’s an intriguing prospect. For those of us who are still waiting for TalkType to come to our practice, the medical AI co-pilot seems like a distant vision; however, judging by the competitive market of TORTUS, Nabla and Abridge, they are undeniably out there. Whether a legion of despondent NHS doctors has enough in the tank to adopt the software remains to be seen, but we have to have hope that all things, be they working practices, medical software or healthcare systems themselves, will improve over time.

The TORTUS panel discussion, involving CEO Dom Pimenta, Annabelle Painter, Karan Koshal and Rozell Kane

Testing 1, 2, GP

As if on cue, the area in which I work has just started to roll out new software for GPs to request imaging from our local hospitals. No more longhand forms with blank spaces for typo-filled reasons why this patient with a chronic headache may warrant a CT scan. Instead, doctors now run through a series of questions to submit their request.

So how was a real-time introduction of an order communication system in primary care? Let me take you through…

Training

An 8-minute YouTube video taking you through a shared-screen presentation of how to order an investigation from the electronic record system (ERS) that we use. Not bad; I don’t think I need in-person hand-holding or anything longer.

Process

It took at least 10 clicks to submit the request, with some previously overlooked fields now mandatory, e.g. prompts to justify the patient’s exposure to ionising radiation. Quite a lot of clicks and drop-downs compared to the basic text input of the old emailed document.
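Behind those mandatory fields is presumably something like a simple completeness check before the request can be submitted. Here is a minimal sketch of that idea; the field names and labels are invented for illustration and are not the real system’s schema.

```python
# Hypothetical mandatory-field check for an imaging order-comms form.
# Field names are invented, not taken from any real ERS.

REQUIRED_FIELDS = {
    "patient_id": "Patient identifier",
    "investigation": "Requested investigation",
    "clinical_question": "Clinical question to be answered",
    "radiation_justification": "Justification for ionising radiation exposure",
}

def validate_request(form: dict) -> list:
    """Return human-readable errors; an empty list means the request can be submitted."""
    return [
        f"Missing: {label}"
        for field, label in REQUIRED_FIELDS.items()
        if not form.get(field, "").strip()
    ]

request = {
    "patient_id": "NHS-1234567",
    "investigation": "CT head",
    "clinical_question": "Chronic headache, exclude intracranial pathology",
    "radiation_justification": "",  # the previously overlooked field, now mandatory
}
print(validate_request(request))
```

The trade-off the post describes falls straight out of this pattern: every rule added to `REQUIRED_FIELDS` improves safety and auditability, but also adds another click for the requesting clinician.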

Advantages

  • The software has more integrated features - I can see recent requested imaging in the same place I’m ordering the test and the results feed back into our ERS directly. I can also see the status of the request e.g. is it pending or completed, which is reassuring compared to emailing a document into the ether.

  • Autopopulating a lot of the answers from the patient’s health record is a great feature and could be revolutionary if it eventually automatically imports blood tests and relevant information without the doctor having to look this up themselves. Real potential here…

Disadvantages

  • Transition to a new system is always a strain and I know there will be some patients who will be caught out. Undoubtedly some old referrals will be rejected or missed during the switch to the new electronic system.

  • The UX could be improved. There are still quite a lot of little clicks and decisions, which at the end of a full clinic, I have little patience for. Sometimes freestyle form-filling is preferable to the ‘all fields must be completed’ computer pedantry.

  • I haven’t pushed the system yet, but sometimes a doctor’s request doesn’t tick all the boxes. My reasons may not fit all the guidelines, but I still want a test done. I wonder whether the current software will allow for blurred lines or whether it is too rigid in its specifications. Watch this space…

Rewired Retro-ed: What’s hot and what’s not

And that’s a wrap. The posters have been put in their cardboard tubes and the games and gimmicks packaged away until the next conference. Two days of the latest in Digital Health #Rewired2024 are over, and Birmingham NEC can devote itself to the next trade fair. And I have a lot to think about.

You couldn’t attend? Not to worry, I’ve got the scoop. Here’s the latest in the Digital Health world:

Hot

  1. Digital Decision Support Tools (DDST)

    The flavour of the month seems to be DDSTs - online programs that ask algorithmic questions whilst incorporating and adhering to guidelines. For clinicians it is the multi-click equivalent of working through the WordArt flowcharts on the consulting room walls. Big hitters like ICE and VAR Healthcare are implementing their own systems, with purported success. But I have some queries about decision fatigue when a computer makes you jump through hoops.

  2. Remote Patient Monitoring

    With a push to save on NHS appointments and in-person contacts, remote patient monitoring is increasingly on trend. Nifty image-transfer software like Isla Health and Accurx is trying to bridge the virtual gap between patient and clinician.

  3. Innovation

    Buzzword of the moment (second only to AI). Heartening to see that all improvements are welcomed, with NHS Innovation’s platform open to small innovators as well as large. However, I wonder whether ‘innovation’ is the new ‘resilience’ - a replacement term for wholesale improvement in working conditions in the NHS.

  4. AI

    No tech conference today would be complete without AI making it into the three most mentioned topics. AI our saviour, miracle and possible paperclip-apocalypse frenemy. Lots of talks, lots of promises, AI is (quite literally with its chatbots) the talk of the town.
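The DDSTs in the ‘Hot’ list above are, at heart, a guideline encoded as a decision tree of yes/no questions. A minimal sketch, with invented questions and advice (not taken from ICE, VAR Healthcare or any real guideline):

```python
# Toy digital decision-support tool: a guideline as a small decision tree.
# Questions and advice are invented for illustration only.

TREE = {
    "question": "Red-flag symptoms present?",
    "yes": {"advice": "Refer urgently"},
    "no": {
        "question": "Symptoms longer than 6 weeks?",
        "yes": {"advice": "Arrange routine imaging"},
        "no": {"advice": "Safety-net and review"},
    },
}

def run_ddst(answers, node=TREE):
    """Walk the tree using a dict of question -> bool answers; return the advice."""
    while "advice" not in node:
        node = node["yes"] if answers[node["question"]] else node["no"]
    return node["advice"]

print(run_ddst({"Red-flag symptoms present?": False,
                "Symptoms longer than 6 weeks?": True}))
```

Every node is one more click for the clinician, which is exactly where the decision-fatigue worry comes from: the tree’s thoroughness and the user’s patience pull in opposite directions.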

What’s not:

  1. Primary Care

    Pilot projects galore are being implemented in secondary care, but far fewer in primary care. C’mon guys! With the largest volume of NHS activity occurring in primary care (1), companies are missing out on a huge potential market. But it’s reassuring that companies like Isla Health and Better are hoping 2024 is the year of the GP. Hope that’s not just lip service…

  2. Informed consent

    Data, data everywhere, but unanswered questions as to who gets to use it and whether patients fully realise what their information can be used for. Data may be anonymised, but it can then be incorporated into many weird and wonderful algorithms looking at health predictions and risk. I wonder if patients truly know what is happening to their data. But then again, it’s no worse than what Google or Meta are doing…

  3. The Working Reality

    I love the positivity of medtech, I truly do. EPRs, DDSTs and AI all promise a revolution in efficiency and, if that’s true, kudos. However, we haven’t yet done away with doctors in the NHS (getting there, though!) and, for all the miracle solutions offered by the many companies at Rewired, I wonder whether the actual fundamentals of the crisis in the NHS (the lack of a nurtured, respected workforce, complex patients with health and social issues, etc.) are being ignored in favour of a neat, digitised UI.

So that’s my take on it. Fascinating talks, great achievements and food for thought from Digital Health Rewired 2024.

(1) https://www.kingsfund.org.uk/insight-and-analysis/data-and-charts/NHS-activity-nutshell

Selfie successfully obtained. Thanks, Rewired, for making it clear!

How to Win at Conferencing

For those of us who follow the med tech world, it’s conference season. With Rewired kicking off in Birmingham, I’m going to give you the beginner’s guide to making the most of any conference you go to:

  1. Grab a programme, pen and paper

    It may be sacrilege at a digital conference to suggest analogue, but sometimes you can’t beat pen and paper. At the start of any conference I grab a programme and look at the different streams of talks and events. I circle the must-sees in one colour and the maybes in another, and then use that to structure my day.

  2. Do a recce

    When it comes to exhibitors, I always start each conference by doing a walk round of the different stands, seeing what’s out there and what looks interesting (this also works for scouting out good freebies or competitions). Using my floor-plan I circle stands to come back to and tick them off between must-see talks.

  3. Make use of the cloakroom

    Wet umbrellas, winter coats and too much stash? Remember most conference venues will have a cloakroom, where you can dump your stuff. Don’t risk unnecessary shoulder ache from a heavy bag. Check it in.

  4. Network

    Conferences are a great way to meet new people and make connections with exciting companies and organisations. Have the LinkedIn app downloaded and add new contacts there and then using its QR code feature.

  5. Take breaks and know when to call time

    You don’t have to see everything, everywhere, all at once (although do see the movie - it’s excellent). Know that you won’t be able to listen to every speech or visit every interesting stand. There may be an element of FOMO but that’s ok - there are always more conferences and other opportunities to connect e.g. reaching out online.

So those are my 5 top tips for conferences. Enjoy the chocolate miniatures and I’ll see you there!

Learn from my mistake earlier at the Best Practice conference at Olympia - check the bag in!

The EPRS of a Doctor's Dreams

When I started as a doctor, over a decade ago, documentation methods were pretty much as they had been for the 20 years before that. At the end of each ward was a huge clunky trolley, which held the patient notes. The sacrificial lamb for the ward round (usually an F1) would be tasked with wheeling the trolley around so the consultant’s review could be recorded in real time. Some patient notes would be so big that the manila folders would break, leaving an unwieldy set of papers held together with a treasury tag.

Paper patient notes… a fossil?

Archaic is not the word.

Over my career I’ve seen the digitisation of the health system. NHS Digital states that 90% of trusts now use electronic patient record systems (EPRS) (1), in line with the Government’s 2022 Plan for Digital Health and Social Care (2). Yet considering these systems fall under the umbrella of med tech, a field that boasts electronic medical marvels like cochlear implants and automatic defibrillators, it’s striking how many medical software systems remain impractical and poorly designed.

In my day-to-day life as a GP, EPRS are the digital health technologies I interact with most. Instead of the giant paper notes of yore, they are the systems recording the patient’s medical history on computers and in the cloud. When they are good, they can be very good, shaving minutes off my precious 10-minute consultation, which I can then give back to my patient.

“But when EPRS are bad, they can be horrid. In the worst case scenario, EPRS with non-intuitive design can lead to prescribing errors.”

But when EPRS are bad, they can be horrid. In the worst-case scenario, an EPRS with non-intuitive design can lead to prescribing errors, one of the worst mistakes to make because of the direct impact on the patient. Among the lesser evils are slow systems that freeze when you’re trying to work at GP light speed, or tech rollouts that don’t take the time to onboard their users. Some of this can be mitigated by intuitive, smart design that minimises the need for multiple mandatory training sessions (med tech companies could take a leaf out of Apple’s child-friendly UX). We may be doctors, but that doesn’t mean we want our EPRS to be the equivalent of sitting the BMAT on a daily basis.

So if I could shout out to any EPRS company out there (hey EMIS! I see you!), here’s what I’d ask for.

The EPRS of a GP’s Dreams:

Is:

  1. Safe

(Simple steps to prescribe medications and record information. Warnings in place if an error looks like it’s being made)

  2. User-friendly

(Not too many steps to log on or complete a task; streamlined and intuitive)

  3. Fast

(No freezing/crashes please)

That’s not too much to ask is it?
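To make the ‘Safe’ wish concrete, here’s a minimal sketch of the kind of dose-range warning an EPRS could surface before a prescription is signed off. The drug names, limits and function are entirely hypothetical and illustrative; they are not clinical guidance, a real formulary, or any actual EPRS vendor’s API.

```python
# Hypothetical maximum daily doses in mg (illustrative only, NOT clinical guidance).
MAX_DAILY_DOSE_MG = {
    "paracetamol": 4000,
    "ibuprofen": 2400,
}

def prescribing_warning(drug: str, dose_mg: int, doses_per_day: int):
    """Return a warning string if the prescription exceeds the daily limit,
    or None when the total dose is within range."""
    limit = MAX_DAILY_DOSE_MG.get(drug.lower())
    if limit is None:
        # Unknown drug: prompt the prescriber rather than silently passing.
        return f"No dose limit on file for '{drug}' - please double-check manually."
    total = dose_mg * doses_per_day
    if total > limit:
        return (f"WARNING: {drug} {dose_mg} mg x {doses_per_day}/day = {total} mg, "
                f"which exceeds the {limit} mg daily limit.")
    return None
```

The point of the sketch is the design principle, not the numbers: the safe path (within range) adds zero extra clicks, while the unsafe path interrupts the prescriber with a clear, specific message.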

  1. https://digital.nhs.uk/news/2023/90-of-nhs-trusts-now-have-electronic-patient-records

  2. https://www.gov.uk/government/publications/a-plan-for-digital-health-and-social-care/a-plan-for-digital-health-and-social-care