Generative AI, a term most often associated with the generation of images, music, speech, code, video, or text, has evolved significantly over the past decade. The recent advancements, particularly the ...
Several prominent foundation models are employed to deepen our understanding of high-throughput biological data; this is followed by a discussion of how prediction and generation models are applied across ...
When Liquid AI, a startup founded by MIT computer scientists back in 2023, introduced its Liquid Foundation Models series 2 (LFM2) in July 2025, the pitch was straightforward: deliver the fastest ...
For patients taking medications that don't work as expected, or for pharmaceutical companies struggling with clinical trial failures, MetaOmics-10T represents a new starting point.
Researchers at Nvidia have developed a new technique that flips the script on how large language models (LLMs) learn to reason. The method, called reinforcement learning pre-training (RLP), integrates ...
The reports of the death of pre-training may have been greatly exaggerated. In a recent appearance on the Dwarkesh podcast, ...