Artificial Intelligence and Dispersed Computing – #2

How is the world of processors changing, and what does it mean for the future of AI? Daniel Bogdanoff and Brig Asay sit down to talk about it.

https://eestalktech.com/wp-content/uploads/2017/04/ai-and-dispersed-computing.mp3

Video (YouTube):

Discussion overview:

Intro 00:00
Intel acquisition of Altera 00:45
What does it mean for Intel to buy an FPGA company?
What is “dispersed computing”? 1:17
Microprocessors used to handle everything
Then, GPUs became integrated 1:45
Offloading computing from a microprocessor 2:02
One option is to use an FPGA to share computing 2:10
ASIC vs FPGA 2:15
ASICs aren’t flexible 2:45
FPGAs give more flexibility than an ASIC 3:03
We use both FPGAs and ASICs in our instruments 3:25

Parallel vs serial buses 3:35

PCIe runs up to x16, and other technologies are going well past 2 and 4 lanes 4:00 (see the bandwidth sketch after this list)
This is helpful, but it adds a lot of design complexity
We’re starting to see new bus technologies emerge 5:00
PCIe, USB, and SerDes used to dominate, but now we’re seeing some other technologies
like Generation Z and CCIX (Cache Coherent Interconnect for Accelerators) 6:00
These make designs faster to bring to market and easier to debug
Generation Z (Gen-Z) 6:25
Generation Z and CCIX build on PCIe technology
Why are these technologies coming out? 7:00
PCIe takes a lot of work to implement 7:35
So these technologies are less stringent 8:00
and are more open 8:15
A lot of PCIe Gen 2 will start to be replaced internally by Gen-Z or CCIX-type buses 8:30
How does the microprocessor connect to other chips in the design? 9:05
That’s the biggest opportunity for speed increases
Thunderbolt has been around for a while 9:45
But, Thunderbolt is finally taking off 10:00
It used to be an internal bus, but now we’re starting to see it externally on consumer devices
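
As a rough companion to the lane-count discussion above (this sketch is not from the episode), here is a back-of-the-envelope calculation of theoretical per-direction PCIe bandwidth from transfer rate, encoding overhead, and lane count. The generation figures are the standard published numbers; the function name is just for illustration.

```python
def pcie_bandwidth_gb_s(transfer_rate_gt_s, encoding_efficiency, lanes):
    """Theoretical per-direction bandwidth in gigabytes per second."""
    bits_per_second = transfer_rate_gt_s * 1e9 * encoding_efficiency * lanes
    return bits_per_second / 8 / 1e9

# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding (80% efficient)
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding (~98.5% efficient)
for name, rate, efficiency in [("Gen 2", 5.0, 8 / 10), ("Gen 3", 8.0, 128 / 130)]:
    for lanes in (1, 4, 16):
        bw = pcie_bandwidth_gb_s(rate, efficiency, lanes)
        print(f"PCIe {name} x{lanes}: ~{bw:.2f} GB/s per direction")
```

The takeaway is simply that bandwidth scales linearly with lane count (a Gen 3 x16 link works out to roughly 15.75 GB/s per direction), which is also why wide links add the routing and signal-integrity complexity mentioned above.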

What are the next major tasks that will be offloaded? 10:25
AI, machines learning from themselves 11:00
“If true artificial intelligence happens, there’s no way a microprocessor can do it all” 11:10
https://en.wikipedia.org/wiki/Big_data

Big data is huge, and that requires a lot of processing and computing 11:32
A processor and a server won’t be able to do it alone 11:50
Is this because there’s too much data? (it’s two-fold) 12:10
1. There’s tons of data 12:45
2. We want to know the answer right away
FPGAs/ASICs are currently doing a “filtering” of data which then feeds into a central processor 13:15 (see the sketch after this list)
Right now, FPGAs are handling very specific tasks 13:56
Intel’s acquisition of Altera is a good indicator of where the industry is going 14:15
The FPGA is going to get smarter and smarter 14:50
Are FPGAs too slow? 15:15
What do designers need their FPGAs to do? 16:10
Companies making FPGAs know they have to deliver higher performance at lower cost 16:30
NVIDIA, Google, and Facebook are all releasing their own chips
FPGA part costs will likely drop in the next 5 years as a result 17:20
Is there a blend of FPGAs and ASICs? 18:00
We’re starting to see FPGAs implemented in data centers and servers 18:15
FPGAs are used there instead of ASICs for their flexibility
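
As a conceptual companion to the “filtering” point above (again, not from the episode), the sketch below models that offload pattern in plain Python: a front-end stage stands in for the FPGA/ASIC trigger logic that discards uninteresting samples, and a back-end stage stands in for the central processor that only analyzes what survives. All names, thresholds, and data here are made up for illustration.

```python
import random

def front_end_filter(samples, threshold):
    """Stand-in for the FPGA/ASIC stage: keep only samples above a trigger level."""
    return [s for s in samples if abs(s) > threshold]

def central_processor(samples):
    """Stand-in for the CPU-side analysis: summarize whatever survived the filter."""
    if not samples:
        return {"count": 0}
    return {"count": len(samples),
            "mean_magnitude": sum(abs(s) for s in samples) / len(samples)}

# Simulate a raw acquisition stream; only a tiny fraction clears the trigger,
# so the "central processor" sees far less data than was captured.
raw_stream = [random.gauss(0.0, 1.0) for _ in range(1_000_000)]
interesting = front_end_filter(raw_stream, threshold=3.0)
print(f"captured {len(raw_stream)} samples, forwarded {len(interesting)}")
print(central_processor(interesting))
```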

Servers lead the PC/consumer market in technology 18:45

Server loads are an order of magnitude greater than PC loads
Hyperscaling 19:40
Historically, you had storage, servers, and routers all separate. Now, they’re getting smarter with resource allocation

Localized vs remote dispersed computing 21:20

All the data has to go somewhere; there’s not a lot of point-to-point
Latency is becoming more of an issue 22:00
Is processor technology plateauing? 22:30
Consumers generally don’t need a lot more processing power as of today, but servers do
Are multi-core processors a harbinger of FPGAs taking on more tasks? 23:55

AI is becoming more and more important 25:05
There’s nothing more debated than artificial intelligence 25:40
We’re using it in a minimalist way 26:00
A “large tech company” had an AI go on Twitter and it didn’t work out very well 26:35
What is it going to take to make AI something that is integral to our daily life? 27:05
For data centers, AI is going to play a role in adjusting to the flux of data 27:50
What’s the difference between artificial intelligence and analytics? 28:15

AI makes the decisions; analytics is just a flow of information
The 2016 U.S. presidential election is a good example of analytics vs AI 28:53
AI has been in Science Fiction for a long time 29:55
AI raises a lot of ethical questions, but we don’t have time to cover them 30:00
Predictions (Luddites, elections, and “the common man”) 30:45
AI and self-driving cars 31:30
