Thermo Fisher Scientific


Moving Beyond Data Hosting to Cloud-Hosted Scientific Applications

By Derek Kern, Principal Software Architect, Thermo Fisher Scientific 02.15.2024

Cloud systems have found their way into everything, from our phones to automobiles to manufacturing systems and beyond. It was only a matter of time before they reached into the lab, enabling us to share valuable instruments and data around the globe. The possibilities for scientific collaboration alone are endless.

However, cloud-hosted systems do not magically improve the application experience. Without proper engineering, the performance of cloud-hosted applications can suffer, leading to poor user adoption. Thermo Fisher Scientific is working to move the needle toward user-friendly cloud applications for the lab, so we have focused on engineering our solution for performance.

Improving cloud-hosted lab application performance

Each cloud provider operates a limited number of regions in its network; AWS, for example, has four regions in the United States, in locations such as Northern Virginia and Northern California. These regions are not always close to labs and their intended end users, and users far from a region can experience latency and poor performance.

CDNs

Thankfully, cloud providers offer services to help meet these challenges, such as Content Delivery Networks (CDNs). CDN services from the major providers include AWS CloudFront, Azure CDN, and Google Cloud CDN. They can be used to strategically cache data in locations closer to end users and reduce file retrieval time. For applications with a lot of static, infrequently changing data (e.g., instrument data files, images, and media), CDNs can be an effective way to mitigate file latency and improve overall application performance by reducing the traffic managed by centralized, regional infrastructure.
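To make the caching idea concrete, here is a minimal sketch of how an edge cache cuts retrieval time: the first request for a file pays the cost of a trip to the distant origin region, while repeat requests are served from a nearby copy. The class, file name, and latency figures are illustrative assumptions, not any provider's API.

```python
# Minimal edge-cache sketch. Latency figures are hypothetical and only
# illustrate the relative cost of origin vs. edge retrieval.

ORIGIN_LATENCY_MS = 120   # assumed round trip to the regional center
EDGE_LATENCY_MS = 15      # assumed round trip to a nearby edge location


class EdgeCache:
    """Keeps a local copy of each object it has served."""

    def __init__(self, origin):
        self.origin = origin   # dict: object key -> payload bytes
        self.cache = {}

    def get(self, key):
        """Return (payload, latency_ms); cache the object on first fetch."""
        if key in self.cache:
            return self.cache[key], EDGE_LATENCY_MS
        payload = self.origin[key]     # expensive fetch from the origin
        self.cache[key] = payload      # subsequent requests stay local
        return payload, ORIGIN_LATENCY_MS


origin_store = {"run-42.raw": b"instrument data"}
edge = EdgeCache(origin_store)

_, first_ms = edge.get("run-42.raw")    # cache miss: origin fetch
_, second_ms = edge.get("run-42.raw")   # cache hit: served from the edge
print(first_ms, second_ms)  # 120 15
```

A real CDN adds expiry, invalidation, and many geographically distributed edges, but the core trade-off is the same: pay the origin cost once, then serve locally.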

Local Zones

While CDNs help accelerate access to more static data, cloud providers also offer "Local Zones" that bring other parts of an application closer to users. For example, AWS Local Zones can host processing resources and less static data, with tools such as Kubernetes clusters, virtual machines, and block storage, in a variety of metropolitan areas outside the regional centers, e.g., Denver for AWS. In this way, Local Zones offer ultra-low latency interaction with processing resources and data.
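The benefit of a Local Zone is simply proximity: of the locations available to host a workload, pick the one with the lowest round-trip time to the lab. The sketch below shows that selection; the zone names follow AWS naming conventions, but the latency figures are hypothetical measurements, not real benchmarks.

```python
# Illustrative zone selection: choose the deployment location with the
# lowest measured round-trip time (RTT). RTT values are hypothetical.


def nearest_zone(latencies_ms):
    """Return the zone name with the smallest round-trip time."""
    return min(latencies_ms, key=latencies_ms.get)


# Hypothetical RTTs as measured from a lab in the Denver area.
measured = {
    "us-east-1 (N. Virginia region)": 55.0,
    "us-west-2 (Oregon region)": 38.0,
    "us-west-2-den-1a (Denver Local Zone)": 4.0,
}

print(nearest_zone(measured))  # us-west-2-den-1a (Denver Local Zone)
```

In practice the measurement side would come from real network probes, but the decision logic, placing latency-sensitive compute and storage in the closest available zone, is exactly what Local Zones enable.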

On-premises

For cases where distributing storage or processing across different regions and Local Zones is insufficient, there are options for bringing cloud services on-premises, such as AWS Outposts, Azure Stack, and Google Anthos. These options allow software to benefit from on-prem cloud resources and deliver the full package: ultra-low latency, local data storage, and local data processing.
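The three options above form a rough decision ladder: cache static files on a CDN, place dynamic workloads in a Local Zone, and bring hardware on-premises only when even Local Zone latency is too high. The helper below is a hypothetical summary of that reasoning; the thresholds are illustrative assumptions, not vendor guidance.

```python
# Hypothetical decision helper summarizing the hosting options discussed.
# The 10 ms threshold is an illustrative assumption, not a recommendation.


def hosting_tier(latency_budget_ms, data_is_static):
    """Pick a hosting option for a workload given its latency budget."""
    if data_is_static:
        return "CDN edge cache"           # static files: cache near users
    if latency_budget_ms >= 10:
        return "Local Zone"               # dynamic, but tolerant of ~ms RTT
    return "On-premises (e.g., AWS Outposts)"  # needs local compute/storage


print(hosting_tier(50, True))    # CDN edge cache
print(hosting_tier(20, False))   # Local Zone
print(hosting_tier(2, False))    # On-premises (e.g., AWS Outposts)
```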

Conclusion

We realize there is more to software design than ensuring applications can be hosted "in the cloud." Labs, instruments, and users must be accommodated wherever they are located and offered a collaborative environment built to share data with a global user base. To meet these challenges, Thermo Fisher is thinking about software differently. Discover the new Ardia Platform and connect with us to learn more about how it's different.

Related information

Whitepaper: Laboratory Digital Transformation: An IT Perspective

Derek Kern

Derek Kern worked designing, building, and troubleshooting enterprise systems for geospatial data and applications for most of his career before joining Thermo Fisher Scientific. Since joining, he has been heavily involved in the effort to design the Ardia Platform for enterprise customers, bringing the cloud into the lab. He has extensive experience in system and software engineering as well as cloud architecture.
