
Building Smarter Solutions with Llama

Posted By:
Lim Ting Hui
Rafael Nicolas Fermin Cota

Are you using Llama 3 or 3.1? Which version? (8B, 70B or 405B)

MetaLearner is leveraging Llama 3.1 across both the 8B and 70B models to power our AI-driven data science workflow.

Tell us briefly about your company/organization and the needs you are solving by building with Llama.

MetaLearner is revolutionizing the way users interact with their data. Our platform, powered by Llama 3.1, allows non-technical users to harness state-of-the-art data science architectures without needing deep technical expertise. By staying current with academic research, our solution ensures users always have the most advanced capabilities at their fingertips. We are also part of the NVIDIA Inception Program and collaborate with the National University of Singapore on R&D and talent cultivation.

What was your original introduction to Llama?

Our journey with Llama began with Llama 2, which impressed us with its leap in model performance and ability to efficiently manage and invoke AI workflows. Its strong data privacy framework made it an ideal fit for our needs, and since then, we have continuously evolved our platform alongside Llama’s advancement.

How (if at all) has your use of Llama evolved over time?

We initially adopted Llama to handle orchestration workflows. Over time, we have expanded its role significantly, incorporating Llama into AI-powered functionalities such as summarization and online search, as well as more complex tasks like text-to-SQL. This evolution has allowed us to create a more dynamic and responsive platform for our users.
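To make the text-to-SQL idea concrete, here is a minimal, hypothetical sketch of how a schema and a user question might be rendered into a prompt for an instruction-tuned model like Llama 3.1. This is not MetaLearner's actual pipeline; the helper name, table, and columns are invented for illustration.

```python
# Hypothetical text-to-SQL prompt builder; schema and names are illustrative.

def build_text_to_sql_prompt(schema: dict[str, list[str]], question: str) -> str:
    """Render a database schema and a natural-language question into a
    prompt asking the model to answer with a single SQL query."""
    schema_lines = [
        f"CREATE TABLE {table} ({', '.join(columns)});"
        for table, columns in schema.items()
    ]
    return (
        "You are a data analyst. Given the schema below, "
        "answer the question with a single SQL query.\n\n"
        + "\n".join(schema_lines)
        + f"\n\nQuestion: {question}\nSQL:"
    )

prompt = build_text_to_sql_prompt(
    {"orders": ["id INTEGER", "customer TEXT", "total REAL"]},
    "What is the total revenue per customer?",
)
print(prompt)
```

The generated SQL would then be validated and executed against the user's database, with results fed back into the conversation.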

How are you using Llama in your work today?

Today, Llama serves as the core intelligence behind our data analytics and data science solutions. By integrating Llama with our proprietary tools, we have built a system where the AI acts like a data analyst in real-time, engaging in natural conversations with users and providing actionable insights seamlessly. It is as if users have a highly-skilled data colleague at their side, making advanced analytics more accessible and interactive than ever.

What challenges did you face while implementing Llama in your work? How did you overcome those obstacles?

In the beginning, integrating Llama as the orchestration engine was a challenge. The model struggled to invoke the correct tools with the proper parameters, leading to some inefficiencies. However, through persistent refinement of both the AI workflow and model orchestration, we optimized the process. As a result, Llama now works smoothly within our systems, delivering accurate outputs with minimal hallucinations.
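One common way to curb mis-parameterized tool calls of the kind described above is to validate every model-proposed call against a tool registry before invoking it. The sketch below is a generic illustration of that pattern, not MetaLearner's actual orchestration code; the tool and its arguments are invented.

```python
# Generic sketch: validate a model-proposed tool call against a registry
# before dispatching, so bad tool names or parameters fail fast and loudly.
import inspect

TOOLS = {}

def tool(fn):
    """Register a callable so the orchestrator can look it up by name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def summarize(text: str, max_words: int = 50) -> str:
    """Toy summarizer standing in for a real AI-powered tool."""
    return " ".join(text.split()[:max_words])

def dispatch(call: dict):
    """Reject unknown tools or unexpected parameters instead of failing
    deep inside the workflow."""
    fn = TOOLS.get(call.get("name"))
    if fn is None:
        raise ValueError(f"unknown tool: {call.get('name')}")
    allowed = set(inspect.signature(fn).parameters)
    unexpected = set(call.get("arguments", {})) - allowed
    if unexpected:
        raise ValueError(f"unexpected parameters: {unexpected}")
    return fn(**call["arguments"])

result = dispatch({"name": "summarize",
                   "arguments": {"text": "a b c d e", "max_words": 3}})
print(result)  # a b c
```

Surfacing validation errors at the dispatch boundary also gives the orchestrating model a clear error message to retry against, which is one practical way to reduce hallucinated tool calls.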

Explain how you fine-tuned Llama for your use case.

Our primary challenge was training Llama to navigate and utilize our proprietary tools effectively. We approached this in two ways: tightening our AI system for better orchestration and fine-tuning the Llama models. As a startup, we've prioritized refining the system itself, since this is independent of LLM advancements. In the future, as foundation model advancements stabilize, we plan to leverage techniques like LoRA to further enhance model performance and efficiency.
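For readers unfamiliar with LoRA, the core idea is to freeze the pretrained weight matrix and train only a low-rank update added on top of it. The toy NumPy sketch below illustrates the mechanics at a tiny, arbitrary scale; it is a didactic example, not a fine-tuning recipe.

```python
# Toy LoRA illustration: instead of updating a full weight matrix W,
# train a low-rank pair (A, B) and add their scaled product to the output.
import numpy as np

rng = np.random.default_rng(0)
d, r, alpha = 8, 2, 16           # hidden size, LoRA rank, scaling factor

W = rng.standard_normal((d, d))         # frozen pretrained weight
A = rng.standard_normal((r, d)) * 0.01  # trainable down-projection
B = np.zeros((d, r))                    # trainable up-projection, zero-init

def lora_forward(x: np.ndarray) -> np.ndarray:
    # Base path plus scaled low-rank update: x @ (W + (alpha/r) * B @ A).T
    return x @ W.T + (alpha / r) * (x @ A.T) @ B.T

x = rng.standard_normal((1, d))
# With B initialized to zero, the adapter starts as an exact no-op,
# so fine-tuning begins from the pretrained model's behavior.
assert np.allclose(lora_forward(x), x @ W.T)
```

Because only A and B are trained, the number of trainable parameters drops from d² to 2·r·d, which is what makes LoRA attractive for adapting models as large as Llama 3.1 70B.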

What's the tangible, real-world impact resulting from your use of Llama? Why should our readers be excited and care about this use case?

ERP systems are in dire need of innovation, and they remain crucial for companies across industries. We’ve revolutionized this space by using Generative AI to transform how businesses perform data analysis and forecasting. With Llama, users can now access advanced data science workflows without any technical expertise. This brings cutting-edge AI capabilities to the center of operational decision-making, making businesses more agile, informed, and competitive. It’s a groundbreaking step forward in modernizing enterprise management.

What impact has an open source approach and the efforts of the open source community had on your organization?

Open source has empowered us to compete with tech giants by allowing us to focus on niche applications and infuse our domain expertise into AI. It also ensures that the work we do is truly ours, eliminating concerns around IP ownership and privacy. The collaboration and transparency of the open-source community have been pivotal in our success, enabling innovation that would be impossible in a closed ecosystem.

At a higher level, what impact do you think an open source approach could have on organizations like yours? What about smaller companies and institutions?

Open source is the driving force behind the next generation of innovation. For companies like MetaLearner and smaller organizations, it acts as a powerful enabler, unlocking opportunities to leverage state-of-the-art technologies like Llama without the massive R&D budgets of large corporations. We believe that open-source platforms will inspire innovations across industries, empowering startups and institutions to innovate rapidly and stay competitive.

Have you released any technical artifacts in support of your work (papers, models, datasets, etc.)? If so, please provide link(s) and details here.

While most of our AI workflow remains proprietary to MetaLearner, we actively contribute to the open-source community as part of our commitment to give back. One example is our Rust engine (https://github.com/carlvoller/excel-rs) which converts Python pandas dataframes to Excel reports with a 95% reduction in compute time compared to the leading Python engine. We’ll continue sharing innovations that push the boundaries of performance and efficiency as we progress.

As the Llama ecosystem continues to grow, how do you think your use of it might evolve over time? For instance, are you leveraging Llama 3, or do you plan to do so?

MetaLearner is fully committed to staying at the forefront of industry advancements. On the day of Llama 3.1's release, we upgraded our AI models from Llama 3 to 3.1 within 30 minutes and immediately rolled the update out to our customers. Our modular architecture allows us to quickly integrate new technologies as they emerge, ensuring that we're always delivering the best AI-driven solutions. As the Llama ecosystem grows, we'll continue evolving alongside it, leveraging future iterations to provide even more value to our users.
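A same-day model swap like the one described above usually comes down to keeping model choices out of application code. The sketch below shows one generic way to do that with a central registry; the role names and model identifiers are illustrative, not MetaLearner's configuration.

```python
# Illustrative sketch: all model settings live in one registry, so a
# version upgrade (e.g. Llama 3 -> 3.1) is an edit to this table alone.
MODEL_REGISTRY = {
    "orchestrator": {"model": "meta-llama/Llama-3.1-70B-Instruct",
                     "temperature": 0.0},
    "summarizer":   {"model": "meta-llama/Llama-3.1-8B-Instruct",
                     "temperature": 0.3},
}

def resolve(role: str) -> dict:
    """Look up the model config for a workflow role; callers never
    hard-code a model name, so upgrades touch only MODEL_REGISTRY."""
    return MODEL_REGISTRY[role]

print(resolve("summarizer")["model"])
```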

If readers take away one thing from this story, what do you hope that would be and why?

We want readers to challenge the status quo and embrace innovation. We’re entering a new era of productivity where the way users interact with software and data is set to change fundamentally. This is a once-in-a-lifetime opportunity, and platforms like Llama are democratizing access to tools once controlled by big tech, opening the door for everyone to innovate.

Anything else you'd like to share with our readers?

MetaLearner is a young and ambitious company, and we’re always open to conversations about technology, partnerships, clients, and investment. If you’d like to learn more or collaborate with us, don’t hesitate to reach out at jose.lama@metalearner.ai. Check out our latest video here and discover the future of ERP data management with MetaLearner.

Do you have any visuals/creative assets that showcase your work with Llama?

Company Introduction Video: https://www.youtube.com/watch?v=liMbNL1B0wg

Meta’s Blog on MetaLearner’s Use of Llama AI for Data Science Transformation: https://ai.meta.com/blog/meta-learner-llama-data-science/

MetaLearner Tutorial on High Performance Search and Summary with Llama 3.1: https://www.metalearner.ai/blog/metalearner-research-web-search-optimization

Linkedin Posts on MetaLearner: https://www.linkedin.com/pulse/metalearner-suggested-posts-rafael-nicolas-fermin-cota-savee/
