Last Week in AI - A Weekly Unwind

From 18-Aug-2024 to 24-Aug-2024

It was yet another thrilling week in AI, with advancements that further extend the limits of what the technology can achieve.

Here are 10 AI breakthroughs that you can’t afford to miss 🧵👇

Neo4j released new open-source GraphRAG Ecosystem Tools for developers to build more reliable and explainable GenAI applications. GraphRAG seamlessly integrates structured and semi-structured data, offering a more comprehensive and reliable foundation for your GenAI endeavors. It works with PDFs, Word documents, YouTube transcripts, Wikipedia pages, and many other kinds of unstructured text.
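
If you'd rather see the idea than read about it, here is a conceptual GraphRAG sketch using the official neo4j Python driver and an OpenAI model. The graph schema (Chunk and Entity nodes linked by MENTIONS) and helper names are made up for illustration; Neo4j's actual GraphRAG tools handle ingestion and retrieval for you.

```python
# Conceptual GraphRAG sketch: pull graph context for an entity, then answer with an LLM.
# The Chunk/Entity/MENTIONS schema is hypothetical, used only to illustrate the flow.
from neo4j import GraphDatabase
from openai import OpenAI

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))
llm = OpenAI()  # reads OPENAI_API_KEY from the environment

def graph_context(entity: str, limit: int = 5) -> str:
    # Fetch text chunks that mention the entity of interest.
    records, _, _ = driver.execute_query(
        """
        MATCH (c:Chunk)-[:MENTIONS]->(e:Entity {name: $entity})
        RETURN c.text AS text LIMIT $limit
        """,
        entity=entity, limit=limit,
    )
    return "\n".join(r["text"] for r in records)

def answer(question: str, entity: str) -> str:
    context = graph_context(entity)
    resp = llm.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer using only the provided graph context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return resp.choices[0].message.content

print(answer("What does the report say about churn?", "churn"))
```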

Nous Research released Hermes 3, fine-tuned versions of the Llama 3.1 models. These new models, available in 8B, 70B, and 405B parameter sizes, show significant improvements in roleplaying, agentic tasks, and multi-turn conversations. Hermes 3 models are designed with a strong emphasis on user alignment and steerability, allowing for greater control and customization compared to commercially available models. The enhanced function calling, structured output capabilities, and improved code generation are particularly useful.
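
Here's a minimal sketch of running the 8B model locally with Hugging Face transformers. The repo id below is an assumption; check the Hermes 3 model card for the exact name and recommended prompt format.

```python
# Minimal sketch: local inference with Hermes 3 (8B) via transformers.
# Repo id assumed to be NousResearch/Hermes-3-Llama-3.1-8B -- verify on the Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NousResearch/Hermes-3-Llama-3.1-8B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "system", "content": "You are a concise, steerable assistant."},
    {"role": "user", "content": "Summarize what GraphRAG is in two sentences."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=200, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```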

Hugging Face has released an in-depth tutorial on building and training your own robot using neural networks. The tutorial provides a comprehensive guide, from assembling the robotic arm (costing approximately $400-$500) to training its AI brain (a few hours on a standard laptop) so it can perform tasks autonomously. It uses techniques similar to how LLMs are trained on text, but adapted for robotics.
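
The tutorial itself walks you through Hugging Face's robotics stack; as a toy illustration of the underlying idea (imitation learning from demonstrations), here is a tiny behavior-cloning loop in PyTorch with made-up shapes and random data standing in for teleoperated recordings.

```python
# Toy behavior cloning: learn to map observations to the actions a human demonstrated.
# The real tutorial uses camera images and a far more capable policy network.
import torch
import torch.nn as nn

obs_dim, act_dim = 12, 6          # e.g. joint angles in, target joint angles out
policy = nn.Sequential(
    nn.Linear(obs_dim, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, act_dim),
)
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)

# Fake demonstration data (observation -> action pairs).
obs = torch.randn(1024, obs_dim)
actions = torch.randn(1024, act_dim)

for step in range(1000):
    pred = policy(obs)
    loss = nn.functional.mse_loss(pred, actions)   # imitate the demonstrated action
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```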

Intel introduced RAG Foundry, a new open-source framework that provides a single workflow for data creation, training, inference, and evaluation of RAG techniques. Its modular design and customizable features make it an ideal choice for both researchers and practitioners seeking to build robust and adaptable RAG solutions.
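
RAG Foundry standardizes the whole loop from data creation to evaluation. As a tiny, framework-free illustration of the evaluation end (not RAG Foundry's actual API), here is a retrieval hit-rate and exact-match check over a toy set of predictions.

```python
# Two simple RAG metrics on a toy evaluation set: did retrieval surface the answer,
# and did the model's prediction match it exactly?
examples = [
    {"question": "Who wrote Dune?", "answer": "Frank Herbert",
     "retrieved": ["Dune is a novel by Frank Herbert."],
     "prediction": "Frank Herbert"},
    {"question": "Capital of France?", "answer": "Paris",
     "retrieved": ["Berlin is the capital of Germany."],
     "prediction": "Berlin"},
]

def hit_rate(examples):
    # Fraction of questions whose gold answer appears in a retrieved passage.
    hits = sum(any(ex["answer"].lower() in p.lower() for p in ex["retrieved"])
               for ex in examples)
    return hits / len(examples)

def exact_match(examples):
    return sum(ex["prediction"].strip().lower() == ex["answer"].strip().lower()
               for ex in examples) / len(examples)

print(f"retrieval hit rate: {hit_rate(examples):.2f}, exact match: {exact_match(examples):.2f}")
```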

Fine-tuning for GPT-4o is now live, allowing you to tailor the model with custom datasets to get higher performance at a lower cost for your specific use cases. This feature comes with a generous offer: 1 million free training tokens per day for every organization until September 23. Imagine achieving state-of-the-art results with a model that understands your domain-specific language and nuances!
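
Here's a minimal sketch of kicking off a fine-tuning job with the OpenAI Python SDK. The train.jsonl file is hypothetical, and the snapshot name should be checked against the fine-tuning docs.

```python
# Minimal GPT-4o fine-tuning sketch with the OpenAI Python SDK.
from openai import OpenAI

client = OpenAI()

# 1. Upload a JSONL file where each line is {"messages": [...]} in chat format.
training_file = client.files.create(
    file=open("train.jsonl", "rb"),
    purpose="fine-tune",
)

# 2. Start the fine-tuning job against a GPT-4o snapshot.
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-4o-2024-08-06",
)
print(job.id, job.status)

# 3. Once the job finishes, call the resulting model like any other:
# client.chat.completions.create(model=job.fine_tuned_model, messages=[...])
```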

Microsoft has released its next series of open-source small language models. The new Phi 3.5 series features Phi-3.5-mini, Phi-3.5-MoE, and Phi-3.5-vision, designed to be lightweight yet powerful, focused on high-quality reasoning, and supporting a 128K-token context length. They are trained on a mix of synthetic and filtered public web data, further tuned for instruction following and safety.
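
A quick way to try the mini model locally with transformers (the repo id is assumed to be microsoft/Phi-3.5-mini-instruct; verify on the Hub):

```python
# Quick local test of Phi-3.5-mini with the transformers text-generation pipeline.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="microsoft/Phi-3.5-mini-instruct",
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
)

messages = [{"role": "user", "content": "Explain mixture-of-experts in one paragraph."}]
out = generator(messages, max_new_tokens=200, do_sample=False)
# The pipeline returns the full chat; the last message is the model's reply.
print(out[0]["generated_text"][-1]["content"])
```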

Your unstructured data is a goldmine, but mining it doesn't have to be an archaeological dig. LlamaIndex has released LlamaExtract, a new managed service that lets you easily extract structured data from your unstructured documents. It offers both schema inference from your documents and the ability to extract values based on a provided schema. You can access LlamaExtract through a user-friendly UI or via an API.
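
LlamaExtract itself is a managed service with its own UI and API; as a generic illustration of the same schema-driven idea (not LlamaExtract's API), here is structured extraction with a Pydantic schema and OpenAI structured outputs. The schema fields and file name are made up for the example.

```python
# Generic schema-driven extraction: define the target structure, let the model fill it in.
from openai import OpenAI
from pydantic import BaseModel

class Invoice(BaseModel):
    vendor: str
    invoice_number: str
    total_amount: float
    currency: str

client = OpenAI()
document_text = open("invoice.txt").read()  # hypothetical unstructured document

completion = client.beta.chat.completions.parse(
    model="gpt-4o-2024-08-06",
    messages=[
        {"role": "system", "content": "Extract the requested fields from the document."},
        {"role": "user", "content": document_text},
    ],
    response_format=Invoice,
)
print(completion.choices[0].message.parsed)  # a validated Invoice instance
```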

Ideogram has launched Ideogram 2.0, a free text-to-image model with five distinct styles, along with an iOS app, a beta API, and Ideogram Search. The model is rated better than Flux Pro and DALL·E 3 by human evaluators.

Running AI locally is gaining traction, but setting up the infrastructure can be a challenge. n8n released a self-hosted AI starter kit to make the process easier and more accessible. This starter kit provides a pre-configured Docker Compose template, making it a breeze to get your AI projects up and running on your own hardware. The kit combines n8n's powerful low-code platform with the flexibility of local AI tools so you can create sophisticated AI agents, automate tasks, and build custom workflows.

AI21 Labs released the Jamba 1.5 family of open models – Jamba 1.5 Mini and Jamba 1.5 Large. These models are built on a new architecture called SSM-Transformer, which combines transformers' strengths with Mamba's efficiency, resulting in superior long-context handling, speed, and quality. The Jamba 1.5 models are available for immediate download on Hugging Face and will soon be available on popular frameworks like LangChain and LlamaIndex.
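
Here's a short sketch of loading the Mini model with transformers for a long-context summarization call. The repo id is assumed (check the Hub), and even the Mini variant is a large mixture-of-experts model, so expect multi-GPU or quantized setups in practice.

```python
# Sketch: long-context summarization with Jamba 1.5 Mini via transformers.
# Repo id assumed to be ai21labs/AI21-Jamba-1.5-Mini -- verify on the Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ai21labs/AI21-Jamba-1.5-Mini"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# The hybrid SSM-Transformer design is what makes very long prompts practical.
long_document = open("report.txt").read()  # hypothetical long input
messages = [{"role": "user", "content": f"Summarize this report:\n\n{long_document}"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=300)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```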

Which of the above AI developments are you most excited about, and why?

Tell us in the comments below ⬇️

That’s all for today 👋

Stay tuned for another week of innovation and discovery as AI continues to evolve at a staggering pace. Don’t miss out on the developments – join us next week for more insights into the AI revolution!

Click on the subscribe button and be part of the future, today!

📣 Spread the Word: Think your friends and colleagues should be in the know? Share Unwind AI and let them join this exciting adventure into the world of AI. Sharing knowledge is the first step towards innovation!

🔗 Stay Connected: Follow us for updates, sneak peeks, and more. Your journey into the future of AI starts here!

Shubham Saboo - Twitter | LinkedIn

Unwind AI - Twitter | LinkedIn | Instagram | Facebook
