Received: 25 January 2024 | Accepted: 13 February 2024
DOI: 10.1111/jmi.13282

INVITED REVIEW

The rise of data-driven microscopy powered by machine learning

Leonor Morgado1,2, Estibaliz Gómez-de-Mariscal1, Hannah S. Heil1, Ricardo Henriques1,3

1 Optical Cell Biology, Instituto Gulbenkian de Ciência, Oeiras, Portugal
2 Abbelight, Cachan, France
3 UCL-Laboratory for Molecular Cell Biology, University College London, London, UK

Correspondence: Hannah S. Heil and Ricardo Henriques, Optical Cell Biology, Instituto Gulbenkian de Ciência, Oeiras, Portugal. Email: hsheil@igc.gulbenkian.pt and rjhenriques@igc.gulbenkian.pt

Funding information: European Union, Grant/Award Numbers: 101057970-AI4LIFE, 101099654-RT-SuperES; LS4FUTURE Associated Laboratory, Grant/Award Number: LA/P/0087/2020; Instituto Gulbenkian de Ciência; Fundação para a Ciência e a Tecnologia, Grant/Award Number: CEECIND/01480/2021; European Molecular Biology Organization, Grant/Award Numbers: EMBO-2020-IG-4734, ALTF 499-2021, ALTF 174-2022; Abbelight; Chan Zuckerberg Initiative, Grant/Award Numbers: vpi-0000000044, NP2-0000000085; European Research Council, Grant/Award Number: 101001332; Calouste Gulbenkian Foundation

Abstract
Optical microscopy is an indispensable tool in life sciences research, but conventional techniques require compromises between imaging parameters like speed, resolution, field of view and phototoxicity. To overcome these limitations, data-driven microscopes incorporate feedback loops between data acquisition and analysis. This review overviews how machine learning enables automated image analysis to optimise microscopy in real time. We first introduce key data-driven microscopy concepts and machine learning methods relevant to microscopy image analysis.
Subsequently, we highlight pioneering works and recent advances in integrating machine learning into microscopy acquisition workflows, including optimising illumination, switching modalities and acquisition rates, and triggering targeted experiments. We then discuss the remaining challenges and future outlook. Overall, intelligent microscopes that can sense, analyse and adapt promise to transform optical imaging by opening new experimental possibilities.

KEYWORDS
data-driven, image analysis, machine learning, reactive microscopy

This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited. © 2024 The Authors. Journal of Microscopy published by John Wiley & Sons Ltd on behalf of Royal Microscopical Society. J. Microsc. 2024;1-8. wileyonlinelibrary.com/journal/jmi

1 INTRODUCTION

Optical microscopy techniques, such as brightfield, phase contrast, fluorescence and super-resolution imaging, are widely used in life sciences to obtain valuable spatiotemporal information for studying cells and model organisms. However, these techniques have certain limitations with respect to critical parameters such as resolution,
acquisition speed, signal-to-noise ratio, field of view, extent of multiplexing, z-depth dimensions and phototoxicity. The trade-offs between these critical imaging parameters are often represented within a 'pyramid of frustration' (Figure 1A). Although improving hardware can extend capabilities, optimal balancing depends on the imaging context. In particular, as scientific research delves into more complex questions, trying to understand the mechanisms of cell and infection biology at a molecular level in physiological context, traditional static microscopes may not be sufficient to capture relevant dynamics or rare events.

Innovative efforts focus on overcoming these restrictions through integrated automation.

FIGURE 1 Data-driven microscopy workflow. (A) Pyramid of frustration: trade-offs in acquisition parameters are visualised as a pyramid, highlighting the interdependence of signal-to-noise ratio (SNR), sample health, temporal resolution, spatial resolution, and the extent of the field of view, 3D volume and multiplexing (x, y, z, λ dimensions). Enhancing one parameter typically compromises at least one other. (B) Schematic of workflow: acquisition control software: software-driven control of microscope hardware for image capture; microscope hardware: imaging devices with programmatic interface; custom reactive agent: integration of a custom reactive agent that analyses live acquisition data, providing real-time feedback to the software to adapt the acquisition parameters. (C) Acquisition stages: observation stage (blue): the sample is continuously monitored using a simple imaging protocol, such as brightfield (this stage is non-invasive and preserves sample health), and reactive stage (magenta): upon detection of a target event (e.g. a cell entering a specific cell-cycle stage), the reactive agent initiates a fluorescence imaging protocol, enabling detailed observation of the event.
Data-driven microscopes employ real-time data analysis to dynamically control and adapt acquisition (Figure 1B). The core concept involves introducing automated feedback loops between image-data interpretation and microscope parameter tuning. Quantitative metrics extracted via computational analysis then dictate adaptive protocols tailored to phenomena of interest. The system reacts to predefined observational triggers by optimising imaging parameters, such as excitation, stage position and objective lenses, to capture critical events efficiently (Figure 1C).

Image analysis algorithms are pivotal in data-driven methodologies, with customised approaches serving a large variety of situations. These approaches can use machine learning techniques to perform tasks such as classification, segmentation, tracking and reconstruction without the need for explicit programming. By integrating machine learning, intelligent microscopes can make contextual decisions by identifying subtle features that traditional rule-based software may miss. Thus, these data-driven principles can increase the efficiency of image acquisition and enrich the information content in diverse scenarios, especially in high-throughput and high-content imaging. They enable the capture of discrete and rare events at different temporal and spatial scales and relate them to population behaviour. This information cannot be accessed with classical imaging approaches, especially because they would require extended exposure to cell-damaging imaging conditions.
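The observe-analyse-adapt feedback loop described above can be sketched in a few lines. This is a minimal illustration only: the acquisition and analysis functions below are hypothetical placeholders (a real system would wrap a microscope-control API such as Micro-Manager's), and the trigger metric here is simply mean image intensity.

```python
import random

# Hypothetical stand-ins for acquisition-control and analysis calls.
def acquire_observation_image():
    """Cheap, low-phototoxicity snapshot (e.g. brightfield); mocked here."""
    return [[random.random() for _ in range(8)] for _ in range(8)]

def analyse(image):
    """Extract a quantitative trigger metric; here, mean pixel intensity."""
    flat = [px for row in image for px in row]
    return sum(flat) / len(flat)

def run_reactive_experiment(n_frames, threshold, on_event):
    """Observe cheaply each frame; fire the reactive protocol on a trigger."""
    events = 0
    for _ in range(n_frames):
        metric = analyse(acquire_observation_image())
        if metric > threshold:   # predefined observational trigger
            on_event()           # e.g. switch to a fluorescence protocol
            events += 1
    return events
```

In a real instrument, `on_event` would reconfigure hardware (excitation, stage, objective) before returning control to the observation stage.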
In this review, we will first introduce the concept of data-driven microscopy and the common methods used to address microscopy challenges. Then, we will explain the principles and frameworks that enable reactive machine learning-based data-driven systems. Finally, we will showcase various applications that benefit from the integration of data-driven microscopy to highlight new experimental possibilities.

2 DATA-DRIVEN MICROSCOPY

Data-driven microscopes can analyse imaging data in real time and execute predefined actions upon specific triggers. These reactive systems feature feedback loops between quantitative image analysis and microscope control, which allows them to tailor data acquisition to objects or phenomena of interest. Specifically for event-driven approaches, this trigger can be based on detecting the occurrence of a specific event. By implementing the prediction of states of interest, even smart or intelligent microscopy approaches can be realised.

Data-driven microscope: The data-driven microscope integrates advanced computational techniques into its imaging capabilities. It uses machine learning algorithms and real-time data analysis to automatically adjust the acquisition parameters. This way it is possible to optimise imaging conditions, enhance image quality and extract meaningful information without heavy reliance on manual intervention.

A recent work showcasing event-triggered protocol switching is by Oscar André et al.1 They, for example, performed dual-scale imaging of host-pathogen interactions using a co-culture model. The system first continuously scanned multiple fields of view of the sample at low magnification to monitor interactions between fluorescently labelled human cells and bacteria. An integrated algorithm analysed each frame to detect interaction events based on proximity analysis. Upon detecting a target number of cell-bacteria interactions, the system automatically switched to a higher numerical aperture objective and acquisition speed and imaged the identified interactions.
This allowed the capture of the cellular actin remodelling induced by the infection at high spatiotemporal resolution. This dual-scale approach balances population-level behavioural monitoring and targeted high-resolution data collection in a highly efficient and high-content manner.

In super-resolution microscopy, data-driven strategies help mitigate inherent trade-offs between resolution, speed, field of view and phototoxicity during live imaging. A system by Jonatan Alvelid et al.2 combines fast widefield surveillance with precisely targeted nanoscopy imaging. For instance, cultured neurons expressing genetically encoded calcium indicators were continuously monitored with widefield imaging to detect neuronal activity. Real-time analysis of calcium dynamics allowed the detection of spike events and localisation of regions of interest. Upon spike detection, the system rapidly positioned and activated Stimulated Emission Depletion (STED) nanoscopy illumination at identified sites to visualise synaptic vesicle dynamics. By limiting high-intensity light exposure spatially and temporally to moments when critical events occurred, this selective super-resolution imaging approach reduced the cumulative photon dose by over 75% compared to continuous STED acquisition.

Beyond adjusting microscope hardware, data-driven systems can coordinate external experimental devices by integrating microfluidics control software. An automated live-to-fixed cell imaging platform called NanoJ-Fluidics, developed by Pedro Almada et al.,3 performs buffer exchange directly on the microscope stage. The system uses simple epifluorescence image analysis to detect cell rounding at the onset of mitosis. Upon rounding detection, NanoJ-Fluidics triggers fixation, permeabilisation and fluorescent labelling through sequential perfusion, preparing the cells for subsequent super-resolution imaging.

These examples showcase the reliance on traditional image analysis techniques to identify events of interest, which typically involve signal colocalisation, intensity or shape thresholds.
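To make such a shape-threshold trigger concrete, here is a toy sketch of rounding detection from a binary segmentation mask, using the classic circularity measure 4πA/P². This is not the NanoJ-Fluidics implementation; the perimeter estimate (counting exposed pixel edges) is crude, so the 0.7 threshold is an invented value that would need calibration on real data.

```python
import math

def circularity(mask):
    """4*pi*area / perimeter**2; higher for rounder (e.g. mitotic) cells.
    `mask` is a binary 2D list; perimeter counts exposed 4-neighbour edges,
    a rough digital approximation of the true contour length."""
    h, w = len(mask), len(mask[0])
    area = sum(sum(row) for row in mask)
    perimeter = 0
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                continue
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if not (0 <= ny < h and 0 <= nx < w) or not mask[ny][nx]:
                    perimeter += 1  # this pixel edge touches background
    return 4 * math.pi * area / perimeter ** 2

def rounding_trigger(mask, threshold=0.7):
    """Fire the reactive protocol (e.g. fixation) when the cell looks round."""
    return circularity(mask) >= threshold
```

A compact 6x6 blob scores about 0.79 with this metric, while a 2x18 elongated blob of the same area scores about 0.28, so a simple threshold separates the two shapes.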
However, the integration of machine learning-based image analysis can elevate reactive data-driven microscopy to a new level by enabling the detection of subtle and complex features that would otherwise go unnoticed.

FIGURE 2 Overview of machine learning concepts for microscopy image analysis. (A) Schematic depicting that deep learning is a subset of machine learning, which is in turn a subset of artificial intelligence. (B) Schematic of the major steps in training and using a machine learning model. First, data is collected, preprocessed and split into training, validation and test datasets. Models are trained on the training dataset, and training is evaluated with the validation set to prevent overfitting. Once trained, a quality control of the model is done using an independent test dataset and, if positive, the model can be used to generate predictions on new unseen data.

3 MACHINE LEARNING FOR AUTOMATED MICROSCOPY IMAGE ANALYSIS

Recent advances in machine learning, particularly in deep learning neural networks, have revolutionised automated image analysis for microscopy. By training on a sufficient amount of data, machine learning models can achieve or surpass human performance in complex image processing tasks such as cell identification, structure segmentation, motion tracking and signal or resolution enhancement. Different models excel in various aspects crucial for enhancing microscopy imaging experiments. In this section, we will introduce fundamental machine learning concepts and highlight learning strategies well suited for microscopy imaging tasks.

Machine learning involves algorithms that learn patterns from data to make predictions without explicit programming.
It falls under the umbrella of artificial intelligence, aiming to imitate intelligent behaviour (Figure 2A). Through a training process, the algorithms tune the parameters of a specific image processing model to perform one particular task. Thus, machine learning practice requires training data, validation data and test data. The latter two datasets are used to evaluate the performance of the model during and after the training, respectively. Upon a positive quality evaluation result, the model can be used on new unseen data to make predictions (Figure 2B). In supervised learning, the model is trained on matched input and output examples, like images and labels, to infer general mapping functions. Unsupervised learning finds intrinsic structures within unlabelled data through techniques like clustering. As a third training category, self-supervised methods run with unlabelled data, as they derive supervisory features from natural characteristics of the data itself.

A relatively simple but powerful machine learning model is the support vector machine (SVM)4 (Figure 3). SVMs excel at classification tasks such as identifying cell types in images. An SVM plots each image as a point in a multidimensional feature space and tries to find the optimal dividing hyperplane between classes. New images are classified based on which side of the hyperplane their features fall on. SVMs have good generalisation ability provided that the features extracted from the classes are descriptive enough to characterise them. In microscopy, SVMs are often used for initial proof-of-concept experiments to classify images into binary categories like mitotic or non-mitotic cells. Their simplicity makes SVMs convenient for implementing basic feedback loops, for instance, triggering high-resolution imaging when a specific cell type is detected.

Deep learning models, including convolutional neural networks (CNNs), are state-of-the-art for complex image processing tasks. CNNs are made up of artificial neurons trained to recognise patterns in image data.
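Returning to the SVM discussed above: a minimal from-scratch sketch of a soft-margin linear SVM trained by sub-gradient descent on the hinge loss. The two per-cell "features" (normalised area and circularity) and all numeric values are invented toy data, not from any cited study; in practice one would use a library implementation such as scikit-learn's.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.05, epochs=500):
    """Soft-margin linear SVM via sub-gradient descent on the hinge loss.
    X: (n, d) feature matrix; y: labels in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) < 1:        # inside margin: hinge is active
                w += lr * (yi * xi - 2 * lam * w)
                b += lr * yi
            else:                            # correct side: only L2 shrinkage
                w -= lr * 2 * lam * w
    return w, b

def classify(w, b, x):
    """Side of the hyperplane decides the class (e.g. mitotic vs not)."""
    return 1 if x @ w + b >= 0 else -1

# Toy features per cell: (normalised area, circularity); +1 = 'mitotic'.
X = np.array([[0.9, 0.95], [0.8, 0.9], [0.85, 0.92],   # round, compact
              [0.2, 0.3], [0.3, 0.25], [0.25, 0.35]])  # spread, irregular
y = np.array([1, 1, 1, -1, -1, -1])
w, b = train_linear_svm(X, y)
```

The learned hyperplane can then serve as the trigger condition in a feedback loop: acquire at high resolution only when `classify` returns the class of interest.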
One of the most influential CNN architectures is the U-Net5 (Figure 3), which was first introduced in 2015. U-Nets have encoder layers that capture hierarchical contextual information and downsample the data, and decoder layers which rebuild the information back into a detailed map using information from the encoder path passed through the skip connections. Compared to SVMs, U-Nets can handle raw images by automatically extracting a rich feature representation and therefore perform better on datasets with increased complexity. In microscopy, U-Nets excel at segmentation tasks like identifying and delineating different cell types, nuclei or components in the image. Their ability to recognise complex structures based on contextual understanding of images makes U-Nets well suited for implementing data-driven microscopy feedback loops. For example, U-Nets could be used to alter illumination or magnification when specific cellular structures are detected.

FIGURE 3 Comparative analysis of machine learning algorithms in data-driven microscopy. Comparison of various machine learning (ML) algorithms employed in data-driven microscopy, delineating their respective advantages, limitations and applications for image analysis tasks.

An additional powerful class of machine learning approaches gaining traction in microscopy are generative adversarial networks (GANs)6 (Figure 3). GANs contain paired generator and discriminator networks trained in an adversarial manner. The generator creates synthetic images to mimic real data, while the discriminator classifies images as real or fake. This competition drives the generator to produce increasingly realistic outputs.
In microscopy, GANs are applied for data augmentation, image enhancement, modality translation and simulation. For instance, GANs can create diverse training data, convert brightfield to fluorescence-like images, or simulate images under different conditions. A major benefit of GANs is that they can be trained in an unsupervised fashion, so no labelled or paired datasets are needed. For smart microscopes, GANs could enable preprocessing loops to improve image quality before analysis and provide an estimate of variability or confidence in the generated results to prioritise tasks. They also hold promise for predicting nanoscale information to guide super-resolution imaging.

Machine learning provides data-driven microscopy with flexibility and empowers faster and more adaptive imaging workflows. Trained models extract relevant information from images that is then used to optimise data collection by adapting microscope parameters accordingly in real time. This transformative potential has been demonstrated across diverse imaging modalities, as highlighted in the next section.

4 APPLICATIONS OF MACHINE LEARNING-POWERED REACTIVE MICROSCOPY

Artificial intelligence is driving the development of intelligent microscopes that can sense, analyse and adapt in real time. Recent innovations have demonstrated reactive imaging systems across various modalities, ranging from widefield to super-resolution microscopy. In this section, we will discuss the key applications of machine-learning-powered reactive microscopy, highlighting the potential of these systems to revolutionise optical imaging.

MicroPilot,7 a software that provides a framework for data-driven microscopy, is one of the pioneering works in the field.
The system is based on LabVIEW and C for image analysis and can be implemented on various commercial systems. It is also compatible with Micro-Manager, an open-source tool widely used to control microscopes. The MicroPilot study provides different examples of how cells can be monitored in low-resolution mode, and images can be analysed using an SVM trained to classify different mitotic stages. A complex experiment is triggered once a cell in a desired stage is detected. After completion, the imaging returns to scanning mode until the next detection. To demonstrate its capacity, an experiment was conducted to study the potential role of a specific protein in the condensation of mitotic chromosomes. The experiment monitored 3T3 cells and triggered Fluorescence Recovery After Photobleaching (FRAP) acquisitions upon identification of each prophase cell. Half of the nucleus was selectively photobleached, and the signal recovery was monitored. It is worth noting that this set of experiments was completed in just four unattended nights, generating results equivalent to what would have taken a full month for an expert user.

Building on this concept, MicroMator8 was developed to offer an open-source toolkit for reactive microscopy using Python and Micro-Manager. It includes pretrained U-Net models to segment yeast and bacteria cells. Researchers applied MicroMator to selectively manipulate targeted cells during live imaging. One noteworthy example involves an optogenetically driven recombination experiment in yeast. In this experiment, genetically modified yeast cells are selected for exposure to light, triggering recombination and the subsequent expression of a protein that arrests growth, together with a fluorescent protein for monitoring purposes. To generate islets of recombined yeast, MicroMator's algorithm individually selects yeast cells at a minimum distance apart, tracks them and repeatedly triggers light exposure on them, increasing the chances of recombination and, thus, the amount of relevant information in the acquired data.
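The cell-selection step described above (picking targets that sit a minimum distance apart so each can be illuminated without affecting its neighbours) can be sketched as a simple greedy filter. This is an illustrative sketch, not MicroMator's actual algorithm.

```python
import math

def select_separated_cells(centroids, min_dist):
    """Greedily keep cell centroids that are at least `min_dist` apart,
    so each selected cell can be targeted for light exposure without
    illuminating a previously selected neighbour."""
    selected = []
    for c in centroids:
        if all(math.dist(c, s) >= min_dist for s in selected):
            selected.append(c)
    return selected
```

For example, with centroids `[(0, 0), (1, 0), (10, 0), (10, 1), (20, 20)]` and a minimum distance of 5, the greedy pass keeps `(0, 0)`, `(10, 0)` and `(20, 20)`.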
In addition to enhancing data information density, the image quality can be dynamically optimised based on the sample properties. One example of this is the learned adaptive multiphoton illumination (LAMI)9 approach, which uses a physics-informed machine learning method to estimate the optimal excitation power to maintain a good SNR across depth in multiphoton microscopy. This becomes particularly relevant for non-flat samples with varying scattering regions. Given the surface characteristics of the sample, LAMI selectively adjusts the excitation power where needed, effectively expanding the imaging volume by at least an order of magnitude while minimising the potential for photodamage effects. The effectiveness is also demonstrated by observing immune cell responses to vaccination in a mouse lymph node with live intravital multiphoton imaging. Furthermore, LAMI significantly reduces computation time by incorporating a machine learning-based method instead of a purely physics-based approach. The computation time is reduced from approximately one second per focal time point to less than one millisecond.

In a study conducted by Suliana Manley and her team, the aim was to image mitochondrial division using Structured Illumination Microscopy while minimising photodamage effects.10 To achieve this, they trained a U-Net to detect spontaneous mitochondrial divisions in dual-colour images labelling mitochondria and the mitochondria-shaping dynamin-related protein 1. The model was integrated into the imaging workflow to switch the acquisition mode between a slow imaging rate, suitable for live-cell observation, and a faster imaging rate, enabling the collection of higher time-resolved data of mitochondrial fission. Interestingly, the similarity in the morphological characteristics and protein accumulation at fission sites allowed the network to be repurposed for detecting fission events in the bacterium C. crescentus without additional training.
The research team quantitatively assessed and compared photobleaching decay among the slow, fast and event-driven acquisitions. When compared to the fast mode, they observed a significant reduction in photobleaching using the event-driven acquisition mode, thereby extending the duration of imaging experiments. As expected, this reduction is not as large as with the slow acquisition mode, but it comes with the benefit of capturing the event with higher temporal resolution and thus measuring, on average, smaller constriction diameters that would otherwise have been missed.

Lastly, the work of Flavie Lavoie-Cardinal's research group focuses on capturing the remodelling process of dendritic F-actin, transitioning from periodic rings to fibres, within living neurons with STED imaging.11 They monitor cells with confocal microscopy, based on which synthetic STED images are generated, considerably reducing photodamage effects. For this, they employ a task-assisted generative adversarial network (TA-GAN). TA-GAN's training is strengthened by also considering the error of actin ring and fibre segmentation in the synthetic images. During acquisition, the system estimates the uncertainty of the model predicting synthetic images to decide whether to initiate a real STED image acquisition and fine-tune the generative model if needed. This allows actin remodelling in stimulated neurons to be tracked with high accuracy and resolution. Their results illustrate that this strategy can reduce the overall light exposure by a significant margin, up to 70%, and importantly, they manage to acquire biologically relevant live super-resolution timelapse images for 15 min.
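The uncertainty-gated decision described above can be sketched schematically. This is not the TA-GAN implementation: as an invented proxy for the model's uncertainty estimate, the sketch uses disagreement across an ensemble of synthetic predictions, and the 0.1 threshold is an arbitrary placeholder.

```python
import statistics

def ensemble_uncertainty(predictions):
    """Mean per-pixel standard deviation across ensemble predictions:
    a simple disagreement-based proxy for model uncertainty.
    `predictions` is a list of equally sized flat pixel lists."""
    n_pixels = len(predictions[0])
    per_pixel_std = [statistics.pstdev([p[i] for p in predictions])
                     for i in range(n_pixels)]
    return sum(per_pixel_std) / n_pixels

def choose_acquisition(predictions, threshold=0.1):
    """Trust the synthetic image when the generator is confident; otherwise
    acquire a real STED frame (ground truth) and fine-tune the model."""
    if ensemble_uncertainty(predictions) > threshold:
        return 'real_sted'
    return 'synthetic'
```

This gating is what limits light exposure: the expensive, phototoxic acquisition only happens when the model admits it cannot predict the frame reliably.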
By integrating machine learning into the microscopy workflow, researchers have showcased techniques to enhance data quality and quantity while minimising phototoxicity. The applications highlighted in this section demonstrate the transformative potential of machine learning-powered microscopes across diverse imaging modalities and biological questions. As machine learning methods and computational power continue advancing, we can expect even more breakthroughs in intelligent microscopy, bringing us closer to the goal of fully automated, optimised imaging platforms that accelerate biological discovery.

5 CONCLUSIONS AND OUTLOOK

Data-driven microscopy has demonstrated impressive capabilities in optimising illumination, modality switching, acquisition rates and event-triggered imaging. These approaches improve image acquisition's efficiency and information content, enabling the study of dynamic biological processes across different scales. Intelligent microscopes offer new experimental possibilities, from observing rare neuronal activity at nanoscale resolution to studying immune cell dynamics in tissues.

However, realising the full potential of data-driven microscopy requires addressing technical and practical challenges. One major limitation is the need for robust and accurate machine learning models, especially when dealing with small microscopy datasets. Expanding open-source repositories of annotated images and simulations can facilitate the development and validation of new algorithms. Additionally, incorporating unsupervised and self-supervised techniques shows promise in overcoming the scarcity of labelled data.
Another critical aspect is the design of microscope hardware optimised for data-driven imaging. Retrofitting analysis and control modules into traditional systems is common, but purpose-built instrumentation that integrates software, optics, detectors and automation is essential. For example, spatial light modulators can enable rapid, adaptable illumination for an optimal signal-to-noise ratio across different samples. On the detection side, high-speed, low-noise cameras or point-scanning systems tailored for live imaging can enhance acquisition speeds.

To increase the use of data-driven microscopy software, it needs to be made more user-friendly and accessible. This can be achieved by creating simplified interfaces for designing and executing reactive imaging experiments, allowing non-experts to take advantage of these advanced methods. Expanding open-source platforms like Micro-Manager will encourage community contributions and drive innovation. Additionally, package managers that facilitate the sharing and installation of pretrained models can help overcome barriers in deploying machine learning solutions.

As data-driven microscopy moves beyond proof-of-concept studies, ensuring the robustness and reproducibility of autonomous microscopes becomes crucial. Maintaining image quality control and detecting failures during unsupervised operation for extended durations is challenging. Detailed performance benchmarking across laboratories using standardised samples can help identify best practices. While this approach can be a great asset in minimising user bias, a selection bias in decision making can still arise. Here, extensive validation of machine learning predictions and adaptive decisions is required to build trust in intelligent systems.

Data-driven microscopy represents a new era for optical imaging, overcoming inherent limitations through real-time feedback and automation.
Intelligent microscopes have the potential to transform bioimaging by opening up new experimental possibilities. Pioneering applications demonstrate the ability to capture dynamics, rare events and nanoscale architecture by optimising acquisition on-the-fly. While challenges in robustness, accessibility and validation remain, the future looks promising for microscopes that can sense, analyse and adapt autonomously. We envision data-driven platforms becoming ubiquitous tools that empower researchers to image smarter, not just faster. The next generation of automated intelligent microscopes will provide unprecedented spatiotemporal views into biological processes across scales, fuelling fundamental discoveries.

AUTHOR CONTRIBUTIONS
L.M., H.S.H. and R.H. conceptualised the majority of the manuscript. L.M. wrote the manuscript with input from H.S.H. and R.H. E.G-M. contributed critical comments and conceptual suggestions to improve the manuscript. All authors reviewed and edited the manuscript.

ACKNOWLEDGEMENTS
This work was supported by the Gulbenkian Foundation (LM, EGM, HSH, RH), received funding from the European Union through the Horizon Europe program (AI4LIFE project with grant agreement 101057970-AI4LIFE, and RT-SuperES project with grant agreement 101099654-RT-SuperES to R.H.) and the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement No. 101001332 to R.H.). Views and opinions expressed are however those of the authors only and do not necessarily reflect those of the European Union. Neither the European Union nor the granting authority can be held responsible for them.
This work was also supported by the European Molecular Biology Organization (EMBO) (Installation Grant EMBO-2020-IG-4734 to RH and the postdoctoral fellowships ALTF 499-2021 to HSH and ALTF 174-2022 to EGM), the Fundação para a Ciência e Tecnologia, Portugal (FCT fellowship CEECIND/01480/2021 to HSH), the Chan Zuckerberg Initiative Visual Proteomics Grant (vpi-0000000044 with DOI: 10.37921/743590vtudfp to R.H.) and the Chan Zuckerberg Initiative DAF, an advised fund of Silicon Valley Community Foundations (Chan Zuckerberg Initiative Napari Plugin Foundations Grant Cycle 2, NP2-0000000085 granted to R.H.). R.H. also acknowledges the support of the LS4FUTURE Associated Laboratory (LA/P/0087/2020). LM is kindly supported by a collaboration between Abbelight and the Integrative Biology and Biomedicine (IBB) PhD programme from Instituto Gulbenkian de Ciência.

CONFLICT OF INTEREST STATEMENT
The authors declare no competing financial interests.

ORCID
Leonor Morgado https://orcid.org/0000-0003-4510-8456
Estibaliz Gómez-de-Mariscal https://orcid.org/0000-0003-2082-3277
Hannah S. Heil https://orcid.org/0000-0003-4279-7022
Ricardo Henriques https://orcid.org/0000-0002-2043-5234

REFERENCES
1. André, O., Ahnlide, J. K., Norlin, N., Swaminathan, V., & Nordenfelt, P. (2023). Data-driven microscopy allows for automated context-specific acquisition of high-fidelity image data. Cell Reports Methods, 3(3). https://doi.org/10.1016/j.crmeth.2023.100419
2. Alvelid, J., Damenti, M., Sgattoni, C., & Testa, I. (2022). Event-triggered STED imaging. Nature Methods, 19(10), 1268-1275. https://doi.org/10.1038/s41592-022-01588-y
3. Almada, P., Pereira, P. M., Culley, S., Caillol, G., Boroni-Rueda, F., Dix, C. L., Charras, G., Baum, B., Laine, R. F., Leterrier, C., & Henriques, R. (2019). Automating multimodal microscopy with NanoJ-Fluidics. Nature Communications, 10(12). https://doi.org/10.1038/s41467-019-09231-9
4. Cortes, C., & Vapnik, V. (1995). Support-vector networks. Machine Learning, 20, 273-297.
5.
Ronneberger, O., Fischer, P., & Brox, T. (2015). U-Net: Convolutional networks for biomedical image segmentation (Vol. 9351, pp. 234-241). Springer Verlag. https://doi.org/10.1007/978-3-319-24574-4_28
6. Goodfellow, I. J., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., Courville, A., & Bengio, Y. (2014). Generative adversarial networks. arXiv:1406.2661.
7. Conrad, C., Wünsche, A., Tan, T. H., Bulkescher, J., Sieckmann, F., Verissimo, F., Edelstein, A., Walter, T., Liebel, U., Pepperkok, R., & Ellenberg, J. (2011). Micropilot: Automation of fluorescence microscopy-based imaging for systems biology. Nature Methods, 8(3), 246-249. https://doi.org/10.1038/nmeth.1558
8. Fox, Z. R., Fletcher, S., Fraisse, A., Aditya, C., Sosa-Carrillo, S., Petit, J., Gilles, S., Bertaux, F., Ruess, J., & Batt, G. (2022). Enabling reactive microscopy with MicroMator. Nature Communications, 13. https://doi.org/10.1038/s41467-022-29888-z
9. Pinkard, H., Baghdassarian, H., Mujal, A., Roberts, E., Hu, K. H., Friedman, D. H., Malenica, I., Shagam, T., Fries, A., Corbin, K., Krummel, M. F., & Waller, L. (2021). Learned adaptive multiphoton illumination microscopy for large-scale immune response imaging. Nature Communications, 12. https://doi.org/10.1038/s41467-021-22246-5
10. Mahecic, D., Stepp, W. L., Zhang, C., Griffié, J., Weigert, M., & Manley, S. (2022). Event-driven acquisition for content-enriched microscopy. Nature Methods, 19(10), 1262-1267. https://doi.org/10.1038/s41592-022-01589-x
11. Bouchard, C., Wiesner, T., Deschênes, A., Bilodeau, A., Turcotte, B., Gagné, C., & Lavoie-Cardinal, F. (2023). Resolution enhancement with a task-assisted GAN to guide optical nanoscopy image analysis and acquisition. Nature Machine Intelligence, 5(8), 830-844. https://doi.org/10.1038/s42256-023-00689-3

How to cite this article: Morgado, L., Gómez-de-Mariscal, E., Heil, H. S., & Henriques, R. (2024). The rise of data-driven microscopy powered by machine learning. Journal of Microscopy, 1-8.
https://doi.org/10.1111/jmi.13282