Qualcomm, in partnership with Lenovo, has just taken the wraps off the world's first 5G PC, powered by Qualcomm's Snapdragon 8cx SoC. Qualcomm, a company better known for its smartphone chipsets, describes this laptop as "the world's first 7 nm platform purpose-built for PCs that offers 5G connectivity." While not much information was revealed, as this is still a work in progress, the two companies are calling the laptop 'Project Limitless.' Qualcomm's new Snapdragon 8cx 5G compute platform is the company's most powerful SoC, paired with its fastest-ever modem. The 8cx 5G platform is expected to deliver peak download speeds of up to 2.5 Gbps and is designed to offer much-improved battery life compared to conventional laptop CPUs. This, the company claims, happens without throttling performance or the need for internal fans.

Stability AI and its CarperAI lab proudly announce Stable Beluga 1 and its successor Stable Beluga 2 (formerly codenamed FreeWilly), two powerful new, open access Large Language Models (LLMs). Both models demonstrate exceptional reasoning ability across varied benchmarks. Stable Beluga 1 leverages the original LLaMA 65B foundation model and was carefully fine-tuned on a new synthetically generated dataset using Supervised Fine-Tuning (SFT) in standard Alpaca format. Similarly, Stable Beluga 2 leverages the LLaMA 2 70B foundation model to achieve industry-leading performance.

Both models are research experiments and are released to foster open research under a non-commercial license.* While we have conducted internal red-teaming to ensure the models remain polite and harmless, we welcome the community's feedback and help in further red-teaming.

The training for the Stable Beluga models was directly inspired by the methodology pioneered by Microsoft in its paper "Orca: Progressive Learning from Complex Explanation Traces of GPT-4." While our data generation process is similar, we differ in our data sources. Our variant of the dataset, containing 600,000 data points (roughly 10% of the dataset size the original Orca paper used), was created synthetically using high-quality instructions from datasets created by Enrico Shippole. To ensure fair comparisons, we carefully filtered these datasets and removed examples that originated from evaluation benchmarks.** Despite training on one-tenth the sample size of the original Orca paper (significantly reducing the cost and carbon footprint of training the model compared to the original paper), the resulting Stable Beluga models demonstrate exceptional performance across various benchmarks, validating our approach to synthetically generated datasets.

To internally evaluate these models, we used EleutherAI's lm-eval-harness, to which we added AGIEval. Both Stable Beluga models excel in many areas, including intricate reasoning, understanding linguistic subtleties, and answering complex questions related to specialized domains. These results were evaluated by Stability AI researchers and independently reproduced by Hugging Face on July 21st, 2023, and published in their leaderboard. As of July 27th, 2023, Stable Beluga 2 is the best model (#1) on the leaderboard, and Stable Beluga 1 is #4.

These models were renamed from their internal codename FreeWilly (a homage to the movies that some of us remember fondly), which referred to the Orca paper. There were multiple reasons for the name change, the most notable being that belugas are gentler animals, unlike the fierce orca (commonly known as the killer whale). Stable Beluga models are optimized for "harmlessness"; the new names therefore fit the models better.

Stable Beluga 1 and Stable Beluga 2 set a new standard in the field of open access Large Language Models. Both significantly advance research, enhance natural language understanding, and enable complex tasks. We are excited about the endless possibilities these models will bring to the AI community and the new applications they will inspire. We would like to express our sincere gratitude to our passionate team of researchers, engineers, and collaborators, whose remarkable efforts and dedication have enabled us to reach this significant milestone. Stay tuned for more exciting developments, and begin exploring the incredible potential of Stable Beluga today!
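As a footnote to the fine-tuning details above: the "standard Alpaca format" is a public instruction-tuning template, and a minimal sketch of it is shown below. This is illustrative only — the post does not spell out the exact template Stable Beluga was trained with, so treat the wording and the `render_example` helper as assumptions.

```python
# Minimal sketch of the standard Alpaca instruction format.
# NOTE: illustrative only -- the exact template used to train
# Stable Beluga is not given in this post.

ALPACA_WITH_INPUT = (
    "Below is an instruction that describes a task, paired with an input "
    "that provides further context. Write a response that appropriately "
    "completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Input:\n{input}\n\n"
    "### Response:\n{output}"
)

ALPACA_NO_INPUT = (
    "Below is an instruction that describes a task. Write a response that "
    "appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n{output}"
)


def render_example(example: dict) -> str:
    """Render one SFT training example into an Alpaca-style prompt string."""
    if example.get("input"):
        return ALPACA_WITH_INPUT.format(**example)
    return ALPACA_NO_INPUT.format(
        instruction=example["instruction"], output=example["output"]
    )


print(render_example({
    "instruction": "Name a gentle whale species.",
    "input": "",
    "output": "The beluga whale.",
}))
```

In SFT, each of the 600,000 synthetic data points would be rendered into a single string like this, and the model is trained to produce the response portion.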