Arm wants to improve the brains in our devices. The chip designer – whose architectures power 99% of smartphones – expects AI to bring a new wave of breakthroughs to our mobile phones.
The company outlined this plan after the release of Llama 3.2 – Meta's first open-source model that handles both images and text. Arm said the models run "seamlessly" on its computing platforms.
The smaller, text-based LLMs – Llama 3.2 1B and 3B – are optimized for Arm-based mobile chips. This allows the models to deliver faster user experiences on smartphones. Processing more AI at the edge can also save energy and costs.
These improvements provide new opportunities for scaling. By increasing the efficiency of LLMs, Arm can run more AI directly on smartphones. For developers, this could lead to faster innovation.
Arm expects this to create countless new mobile apps.
By knowing your location, schedule, and preferences, LLMs can perform tasks on your behalf. Routine tasks are automated and recommendations are personalized on the device. Your phone will evolve from a command-and-control tool into a "proactive assistant."
Arm wants to accelerate this development. The UK-based company wants its CPUs to be “the foundation for AI everywhere.”
Arm has an ambitious timeline for this strategy. By the end of 2025, the chip giant wants more than 100 billion Arm-based devices to be "AI-enabled."