AI-podden News — 29 Nov

28:44
 

Let’s start with the company behind ChatGPT, OpenAI. No one has missed the past couple of weeks’ happenings at OpenAI, where CEO and co-founder Sam Altman was fired on Friday the 17th and then reinstated as CEO on the following Monday evening. There have been a lot of rumours about why he was fired in the first place, but I think we need to focus on something different. Usually, when you push out your CEO and co-founder, your investors get a heads-up at the very least. In OpenAI’s case, the investors include Microsoft, Khosla Ventures, Andreessen Horowitz, Founders Fund and Sequoia; these are big firms, and all of them were kept in the dark. The reason is that none of these investors sits on the OpenAI board of directors, because the company has an unusual structure: it is governed like a non-profit. I believe this was set up as a safety measure, since OpenAI is working on AGI (artificial general intelligence), and if the CEO diverged from the safest path, the board could fire him. So, after that TL;DR: is this a good way to govern an AI company?

Amazon’s new 2-trillion-parameter LLM Olympus (double what GPT-4 has) puts it in competition with OpenAI, Meta, Anthropic, Google and others

Earlier this month, I read in Reuters that Amazon is investing millions in training an ambitious large language model (LLM), hoping it could rival OpenAI, Google and Meta. The model, codenamed “Olympus”, has 2 trillion parameters, sources said, which could make it one of the largest models being trained; OpenAI’s GPT-4 is reported to have one trillion parameters. So it seems the more parameters the better. However, I then read about a Japanese LLM by NEC that has cut the size down to “only” 13 billion parameters and is said to achieve high performance despite the smaller parameter count, thanks to its own innovations. This not only reduces power consumption but also enables operation in both cloud and on-premises environments, because the model is lightweight and fast (a rough sizing sketch at the end of these notes illustrates why parameter count matters so much here). There is an understanding that the better an LLM is at language, the more persuasive, and also the more innovative, it can be. Is this the reason why so much work is being done on training LLMs on specific languages?

Samsung’s AI race with Apple – how will AI development be visible in our smartphones? https://www.theverge.com/2023/11/8/23953198/samsung-galaxy-ai-live-translate-call

Some say that AI-powered features are becoming the next battleground for smartphone makers. Samsung came out this month with a feature that uses artificial intelligence to translate phone calls in real time; it is calling it “AI Live Translate Call”, and it will be built into the company’s native phone app. Samsung says “audio and text translations will appear in real-time as you speak”. But Samsung is not alone: Google, for example, has a suite of AI-powered tools to help you edit and improve photos with its Pixel 8 lineup, and Apple is reportedly spending a lot of money every day to train AI, so I have to imagine all that investment will show up in some AI-powered features for iPhones. So, what will this mean for our smartphones?
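As a rough illustration of that size gap (my own back-of-the-envelope sketch, not something from the episode), here is what it takes just to hold the weights of models at the parameter counts quoted above. The precisions and bytes-per-parameter figures are generic assumptions, not numbers reported by Amazon, OpenAI or NEC.

```python
# Back-of-the-envelope sketch: memory needed just to store model weights at the
# publicly reported parameter counts quoted in the episode. Precisions and byte
# sizes are generic assumptions, not vendor figures.

BYTES_PER_PARAM = {"fp32": 4, "fp16/bf16": 2, "int8": 1}

models = {
    "Amazon 'Olympus' (reported)": 2_000_000_000_000,  # 2 trillion parameters
    "GPT-4 (reported)": 1_000_000_000_000,             # 1 trillion parameters
    "NEC Japanese LLM": 13_000_000_000,                 # 13 billion parameters
}

for name, params in models.items():
    sizes = ", ".join(
        f"{precision}: {params * nbytes / 1e9:,.0f} GB"
        for precision, nbytes in BYTES_PER_PARAM.items()
    )
    print(f"{name}: {sizes}")
```

Even at 8-bit precision, a 2-trillion-parameter model needs on the order of 2 TB of memory for its weights alone, spread across many accelerators, while a 13-billion-parameter model fits in roughly 13 GB, which is why a model of NEC’s size can realistically run on-premises.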

Don't forget to subscribe and follow us on LinkedIn.
