The generational improvements in lab instruments and molecular assays have ushered in the era of scaled biology. Scientists have more options than ever before, and the downstream result is more complex and dynamic experimental design. Modern biotechs emphasize speed of lab experimentation along with the operational rigor to orchestrate parallel protocols. Meanwhile, data is quickly moving from a cost center to a key value driver and defensive moat in life science. Research is increasingly driven not by the readout of any single experiment, but by the cumulative, compounding effects of downstream analysis on mountains of experimental data. This data transition makes quality, interoperability, reproducibility, and consistency of data outputs an absolute need. This industry backdrop serves as a long-term, enduring tailwind for lab automation solutions. One of our industry conversations about this space put it simply: AI-ready data does not exist without lab automation. The modern lab is a data factory, and it needs an integrated automation suite that translates experimental design into clean data streams ready for pipelining and analysis.

The existing automation solutions in industry fall into two general camps. The first is “single instrument automation”, driven by providers (Tecan, Dynamic Devices, Hamilton) that build devices for bulk plate processing of a single workflow step (e.g. liquid handling). These solutions are widely adopted in the industry, but have limited impact on lab throughput, overhead, and data quality because their scope is fixed to a single step in a lengthy multi-step protocol. The second camp is the “custom integrator” option, with vendors (Thermo Fisher, HighRes, Biosero) that design entirely custom hardware and software configured to a particular automation need. These systems can be robust, but are difficult to both implement and reconfigure. Integrators can take 9 to 12 months to properly scope an automation solution, leading to onerous upfront setup costs and delays.

Automata has met this unaddressed gap head-on and built the lab automation solution for the next generation of life science. Automata’s LINQ platform addresses the key limitations of current solutions and brings life science into a paradigm of “Total Lab Automation”: software-controlled, end-to-end multi-instrument coverage that is modular, programmable, and scalable. Automata lets scientists work at the level of high-level experimental design in systems like Benchling, while digitizing those experiments into protocols that can be run programmatically in the lab. In stark contrast to the popular integrator-led solutions, Automata’s LINQ system is priced at a fraction of an integrator install and is built for the fast-changing environment of the modern biotech.

Hardware

The hardware design of LINQ centers on a modular system with three features per unit: a bench to house and integrate with lab instruments, a magnetic-shuttle transport layer to move plates, and a robotic arm for instrument tending and plate loading and unloading. The unit bench design mirrors the existing footprint of lab workbenches and makes the shift seamless, as each LINQ bench can be swapped in as a direct replacement for a manual bench.

Any individual bench unit is able to connect to additional units to create customizable configurations that branch out from a central design and product standard. This allows Automata to minimize product complexity while serving customers across a wide range of use cases.

Software

The role of software at Automata is to bridge science and lab execution. The surface area between Automata and scientific design is twofold: at the point of design, and after the point of execution. The LINQ Cloud system integrates with experimental design through a suite of tools built to manage protocols. In addition, Automata pulls data off of instruments to handle validation and pipelining, generating clean experimental data that is ready for downstream analytics.
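To make the data-validation step concrete, here is a minimal sketch of what checking raw instrument readings before pipelining could look like. This is purely illustrative: the function name, field names, and range checks are our own assumptions, not Automata’s actual schema or API.

```python
# Hypothetical sketch of instrument-data validation before pipelining.
# Field names and thresholds are illustrative assumptions only.

def validate_well_readings(readings, expected_wells, min_od=0.0, max_od=4.0):
    """Split raw plate readings into clean values and error records.

    readings: dict mapping well ID (e.g. "A1") to a measured value.
    expected_wells: wells the protocol says must be present.
    Returns (clean, errors): passing readings and human-readable issues.
    """
    clean, errors = {}, []
    for well in expected_wells:
        if well not in readings:
            errors.append(f"{well}: missing reading")
            continue
        value = readings[well]
        if not (min_od <= value <= max_od):
            errors.append(f"{well}: value {value} outside [{min_od}, {max_od}]")
            continue
        clean[well] = value
    return clean, errors
```

Only readings that pass completeness and range checks flow downstream; everything else is flagged for review rather than silently dropped, which is what keeps the resulting data streams trustworthy for analysis.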

The major tasks to be managed by Automata software are:

  • Protocol ingestion and design: Client experimental designs need to be encoded into machine instructions that the Automata system can execute. Customers want the ability to manage digital SOPs and also make changes directly to adjust protocols in real time. Some customers want the ability to code protocols and integrate directly with Git repositories of automation scripts, while others prefer to expose scientists to a no-code visual interface for protocol editing.
  • Lab scheduling: Protocols need to be loaded onto Automata instruments, and pre-run checks must pass before experiments can execute. Scheduling is interdependent with ongoing lab operations, as the entire Automata system needs to be set up correctly before a protocol can be run.
  • Protocol orchestration: Most labs run multiple protocols within the same time frame, often sharing instrument availability and experiencing manual interrupts that pause protocols mid-run. Customers benefit greatly from Automata’s ability to manage experiment orchestration across the entire lab.
  • Instrument connection and operations: Automata translates digital SOPs into instructions tailored to each integrated lab instrument. The LINQ platform serves as an automation wrapper around a lab instrument, using robots to handle machine tending and software integrations to control instrument run cycles.
  • Raw data collection: Given its proximity to each experimental run, Automata is in the best position to capture post-run data coming off the instruments and manage early data pipelining into LIMS systems and other analytical workflows.
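The tasks above can be illustrated with a toy model of protocol orchestration: protocols as ordered lists of steps, each bound to an instrument, with a greedy scheduler that assigns each step the earliest free slot on its instrument. The class and function names here are hypothetical, a sketch of the underlying scheduling problem rather than Automata’s implementation.

```python
# Hypothetical sketch: protocols as ordered steps bound to instruments,
# scheduled greedily onto shared instruments. Illustrative names only.
from dataclasses import dataclass

@dataclass
class Step:
    name: str
    instrument: str   # e.g. "liquid_handler", "plate_reader"
    minutes: int

@dataclass
class Protocol:
    name: str
    steps: list

def schedule(protocols):
    """Assign each step the earliest slot on its instrument, while
    preserving step order within each protocol."""
    instrument_free = {}   # instrument -> time it next becomes free
    timeline = []          # (protocol, step, start, end) tuples
    for proto in protocols:
        t = 0  # protocol-local clock: a step starts after its predecessor
        for step in proto.steps:
            start = max(t, instrument_free.get(step.instrument, 0))
            end = start + step.minutes
            instrument_free[step.instrument] = end
            timeline.append((proto.name, step.name, start, end))
            t = end
    return timeline
```

Even this toy version surfaces the core difficulty: two protocols contending for the same plate reader force one to wait, and a real orchestrator must additionally handle pre-run checks, mid-run interrupts, and re-planning, which is why lab-wide orchestration is hard to bolt onto single-instrument automation.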

Automata launched the LINQ system a couple of years ago and has scaled tremendously since then. The product is already deployed in partnership with industry leaders across Pharma, biotech, diagnostics, and CROs. We are excited to partner with the entire team to build on this early commercial success and create the leading automation provider for the next generation.

https://www.linkedin.com/in/nanli/
Nan Li
Founder & Managing Partner
Nan has a mixed technology, investing, and entrepreneurial background and has been an active early-stage venture investor for over a decade. He is interested in the application of cutting-edge technologies including computer vision, AI/ML, NLP, robotics, and automation systems.
