
Pipeline

Data architecture, artificial intelligence, automated processing, containerization, and cluster orchestration ease the transition from data acquisition to insights in medium‐to‐large datasets.

In this journal article, Sierra’s researchers, in collaboration with PNP Research Corporation (MA, USA) and the Wellman Center for Photomedicine (Boston, MA, USA), discuss how to handle large microscopy datasets and process them automatically using artificial intelligence and modern IT tools, such as containerization and Kubernetes. You can find the full open-access article at https://onlinelibrary.wiley.com/doi/full/10.1002/bies.201900004
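To illustrate the cluster-orchestration idea in the simplest possible terms, the sketch below uses the official Kubernetes Python client to launch one batch Job per microscopy image. It is only a hedged illustration: the namespace, container image name, file names, and command-line arguments are hypothetical placeholders, not details taken from the article.

```python
from kubernetes import client, config

# Assumes a reachable cluster and a local kubeconfig; both are hypothetical here.
config.load_kube_config()
batch_api = client.BatchV1Api()

# Hypothetical list of dark-field images sitting on shared storage.
images = ["image_0001.tif", "image_0002.tif", "image_0003.tif"]

for idx, filename in enumerate(images):
    job = client.V1Job(
        metadata=client.V1ObjectMeta(name=f"np-count-{idx:04d}"),
        spec=client.V1JobSpec(
            backoff_limit=2,  # retry a failed pod at most twice
            template=client.V1PodTemplateSpec(
                spec=client.V1PodSpec(
                    restart_policy="Never",
                    containers=[
                        client.V1Container(
                            name="counter",
                            # Hypothetical container holding the analysis code.
                            image="registry.example.com/np-counter:latest",
                            args=["--input", f"/data/{filename}"],
                        )
                    ],
                )
            ),
        ),
    )
    # One Job per image lets the cluster work through the dataset unattended,
    # e.g. overnight, without user interaction.
    batch_api.create_namespaced_job(namespace="microscopy", body=job)
```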

The abstract reads: «Here, a streamlined, scalable, laboratory approach is discussed that enables medium‐to‐large dataset analysis. The presented approach combines data management, artificial intelligence, containerization, cluster orchestration, and quality control in a unified analytic pipeline. The unique combination of these individual building blocks creates a new and powerful analysis approach that can readily be applied to medium‐to‐large datasets by researchers to accelerate the pace of research. The proposed framework is applied to a project that counts the number of plasmonic nanoparticles bound to peripheral blood mononuclear cells in dark‐field microscopy images. By using the techniques presented in this article, the images are automatically processed overnight, without user interaction, streamlining the path from experiment to conclusions.»
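The counting step itself is only described at a high level in the abstract, where the authors rely on artificial intelligence. Purely as a stand-in for that model, the short sketch below counts bright spots in a dark-field image using classical Laplacian-of-Gaussian blob detection from scikit-image; the file name, sigma range, and threshold are assumptions that would need tuning for real data.

```python
from skimage import color, io
from skimage.feature import blob_log

# Hypothetical dark-field image of a peripheral blood mononuclear cell.
img = io.imread("darkfield_pbmc.tif")
if img.ndim == 3:
    # Collapse RGB to a single intensity channel before detection.
    img = color.rgb2gray(img)

# Laplacian-of-Gaussian blob detection; the parameters are guesses standing in
# for the trained AI model described in the article.
blobs = blob_log(img, min_sigma=1, max_sigma=5, num_sigma=5, threshold=0.05)
print(f"Detected {len(blobs)} candidate nanoparticles")
```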
