SuzukiSavage.com
General Category >> The Cafe >> ARM AI and Object Detection modules now available

Message started by Oldfeller on 02/15/18 at 06:26:28

Title: ARM AI and Object Detection modules now available
Post by Oldfeller on 02/15/18 at 06:26:28

 
https://liliputing.com/2018/02/arms-project-trillium-brings-device-machine-learning-smartphones-speakers-cars.html

AI is coming along into the generic acceptance phase starting with stuff you can buy later on this year.

Chip designer ARM is unveiling two new categories of processors designed to bring machine learning to low-power devices without the need to phone home to a cloud server.

The ARM ML processor is a new type of chip designed for general on-device machine learning, while the ARM OD processor is designed specifically for one type of machine learning: object detection.

They’re both part of a larger initiative called Project Trillium that includes hardware and software to bring AI features to devices with ARM chips, including smartphones, smart speakers, security cameras, self-driving cars, and data centers.

ARM is describing Project Trillium as “a new suite of ARM IP,” or intellectual property, because it’s not just about new chips. There are already devices with ARM-based chips that use machine learning, including security cameras and other Internet of Things gadgets, and ARM says device makers can continue to use ARM Cortex-M chips or other low-power processors.

But the new hardware seems pretty intriguing.

The company says its new ARM ML processor can handle 4.6 trillion operations per second while using less than 2 watts, while the new ARM Object Detection processor can process full HD video at 60 frames per second, detecting individual objects as small as 50×60 pixels.
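To put those object-detection figures in rough perspective, here is a quick back-of-the-envelope sketch using only the numbers quoted above (the frame size, frame rate, and minimum object size are ARM's figures; the rest is plain arithmetic):

# Rough frame-budget arithmetic for the quoted ARM OD processor specs.
# Illustration only, not a benchmark.
frame_width, frame_height = 1920, 1080    # "full HD"
fps = 60                                  # frames per second
min_obj_w, min_obj_h = 50, 60             # smallest detectable object, per ARM

pixels_per_frame = frame_width * frame_height
pixels_per_second = pixels_per_frame * fps
frame_budget_ms = 1000 / fps

print(f"Pixels per frame:      {pixels_per_frame:,}")            # ~2.07 million
print(f"Pixels per second:     {pixels_per_second:,}")           # ~124 million
print(f"Time budget per frame: {frame_budget_ms:.1f} ms")        # ~16.7 ms
print(f"Smallest object area:  {min_obj_w * min_obj_h:,} px")    # 3,000 px, about 0.14% of a frame

In other words, the OD block has roughly 16.7 milliseconds to chew through about 2 million pixels per frame and still pick out objects covering only a little over a tenth of a percent of the image.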

It’ll be a little while before you start to see devices with those new chips: ARM will start making the new designs available in mid-2018 and we could start to see hardware in late 2018 or early 2019.


Remember, the forward-seeking chip makers out there haven’t necessarily been waiting on ARM to start adding AI and machine learning capabilities to their own ARM-based designs.   Companies including Apple, Qualcomm, Huawei, Imagination, and Rockchip have all introduced processors with dedicated AI and neural networking features within the past year.   The issue with these was their modest size, the lack of an open standard that would let all products interoperate, and the lack of an open, uniform data exchange format.   All of that has now arrived with ARM Trillium.

AI processing power has been doubling every six months so far, and this jump to a generic 4.6 trillion operations per second while using less than 2 watts of power means another doubling of AI horsepower is taking place right now, at the generic introduction of Trillium.
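If you take that six-month doubling claim at face value, the step from last year's ~2.4 TOPS first efforts (described below) to Trillium's 4.6 TOPS sits almost exactly one doubling later. A quick projection sketch, assuming the claimed rate simply holds (the post's claim, not a guarantee):

# Toy projection of the "AI horsepower doubles every six months" claim.
# 2.4 TOPS is the size of the first proprietary efforts mentioned below;
# 4.6 TOPS is ARM's Trillium figure.  The forward numbers are illustrative only.
start_tops = 2.4
doubling_period_months = 6

for months in range(0, 25, 6):
    projected = start_tops * 2 ** (months / doubling_period_months)
    print(f"+{months:2d} months: ~{projected:5.1f} TOPS")
# +0 -> 2.4, +6 -> 4.8 (Trillium's 4.6 lands right about here),
# +12 -> 9.6, +18 -> 19.2, +24 -> 38.4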

Why is this important?   Of the early adopters, Huawei and Rockchip simply recycled last year's chipsets with the addition of a small AI module, and both of them immediately outperformed the then-current industry leaders, for minimal disruption and a very minimal cash outlay.   But only on certain items, and only when using their own proprietary software.   The size of these very first efforts was 2.4 TOPS and less.

Now here comes a canned, generic, "anybody can use it" 4.6 TOPS setup, complete with Object Detection software and Machine Learning software, all in an industry-standard data exchange format.

Now let's slowly repeat an important little nugget for complete clarity.

 .....  4.6 trillion operations per second while using less than 2 watts of power  .....  

In terms of raw speed of operations, this is at the level of an ASCI White year-2000 supercomputer -- but not in total throughput, as supercomputers of that era needed MULTIPLE THOUSANDS of full-sized desktop processors all working at the same time to hit that 4.6 trillion operations per second (cumulative raw speed of operations), because each individual desktop processor was actually quite a lot slower.   And the whole shebang drew a massive 850 kW of electrical power when running at full speed (a small town's worth of power) and also needed an industrial-sized cooling plant to keep the room cool.
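Using just the two power figures quoted above, the efficiency gap works out roughly like this (a sketch only; it compares raw operations per second per watt and ignores that supercomputer floating-point operations and NPU TOPS are not the same kind of operation):

# Performance-per-watt comparison using the figures quoted in the post.
# Supercomputer FLOPS and NPU TOPS are not directly comparable operations;
# this only illustrates the scale of the power gap.
arm_ops_per_sec = 4.6e12       # 4.6 trillion operations per second
arm_watts = 2.0                # "less than 2 watts"

super_ops_per_sec = 4.6e12     # same cumulative raw rate, per the comparison above
super_watts = 850_000          # 850 kW, not counting the cooling plant

arm_eff = arm_ops_per_sec / arm_watts          # ~2.3 tera-ops per watt
super_eff = super_ops_per_sec / super_watts    # ~5.4 giga-ops per watt

print(f"ARM ML processor:  ~{arm_eff:.2e} ops/W")
print(f"Year-2000 machine: ~{super_eff:.2e} ops/W")
print(f"Efficiency ratio:  ~{arm_eff / super_eff:,.0f}x")   # 850,000 W / 2 W = 425,000x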

http://2.bp.blogspot.com/-Zkj7iVonrk8/UIwXMa-d9mI/AAAAAAAACXQ/bYHyIOKFeH8/s1600/10_ASCII-RED-Supercomputer.gif

But hey, it gives you a gut read on what 4.6 TOPS at less than 2 watts of 5 volt power really means .....

ANY FUNCTION that you can program to run here in the AI ZONE runs 100 times faster than your competition's best and most expensive current product, at less than 1/10th the power draw from the battery.
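Multiplying those two claims together gives a single performance-per-watt figure (again, just arithmetic on the claims as stated above, not a measured result):

# Combining the "100x faster" and "1/10th the power" claims as stated in the post.
speedup = 100          # claimed speed advantage over current products
power_fraction = 0.1   # claimed power draw relative to current products

perf_per_watt_gain = speedup / power_fraction
print(f"Claimed performance-per-watt advantage: {perf_per_watt_gain:,.0f}x")   # 1,000x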

YES, EXPECT EVERYTHING TO CHANGE OVER TO USING AI VERY, VERY QUICKLY.    Both Samsung and Micron are now shipping non-volatile 7nm memory products that can run natively at these speeds, so memory bottlenecking isn't going to be a show stopper on any properly designed products.

Title: Re: ARM AI and Object Detection modules now availa
Post by Oldfeller on 02/15/18 at 06:51:56


What else is new?

https://www.androidcentral.com/sites/androidcentral.com/files/styles/large/public/article_images/2018/02/samsung-dex-pad-render-leak-1.jpg?itok=tUs1dPc

https://www.androidcentral.com/sites/androidcentral.com/files/styles/large/public/article_images/2018/02/samsung-dex-pad-render-leak-2.jpg?itok=txF16w-

See the new Samsung S9 DeX dock, which lets your cell phone's screen serve as your desktop track pad and the phone itself be the rest of your desktop computer system.   All you need to add is a monitor and a big keyboard.



Title: Re: ARM AI and Object Detection modules now availa
Post by LANCER on 02/15/18 at 06:59:09

"And there was given to him to give breath to the image of the beast, that the image of the beast might even speak and cause as many as do not worship the image of the beast to be killed."
Rev. 13:15

The time draws near.

Capability doubles every 6 months?
And the time for that to happen continues to decrease as the builders progress further and further.


"But when these things begin to take place, straighten up and lift up your heads, because your redemption is drawing near."
Luke 21:28

Title: Re: ARM AI and Object Detection modules now availa
Post by Oldfeller on 02/15/18 at 07:56:45


The new ARM industry standard comes out at 7nm lithography and 4.6 TOPS, with Samsung and TSMC shipping their very first full 7nm SOCs by year's end, which will also be running the new super-fast Samsung and Micron 7nm non-volatile memory -- so where is Intel sitting in all of this?

Still stuck at 14nm and piling on more and more big, slow 14nm cores to try to make up for being so very, very far behind technologically.   Intel is starting to promise a few 10nm chipsets, but it is all future promises, no hard reality at this time.

Stock pundits are now asking the important question: "Why does the Intel stock price still stay so high?"

Intel stock does not have a buy rating any longer, so why do people keep on buying it?

The answer seems to be that American stock buyers have always bought Intel stock .....   it is part of the default mix of every single mixed-stock retirement plan.

Even the CEO of Intel has dumped large blocks of his own personal stock .....   get with it, boys and girls, Intel is sinking fast technologically.   It should not be purchased by rote any more by anybody.   Nor should anyone be automatically buying GE stock in their retirement plan (for the same reasons).

Title: Re: ARM AI and Object Detection modules now availa
Post by justin_o_guy2 on 02/16/18 at 10:34:25

"But when these things begin to take place, straighten up and lift up your heads, because your redemption is drawing near."
Luke 21:28

Come quickly..

Title: Re: ARM AI and Object Detection modules now availa
Post by JOEL2014 on 02/18/18 at 08:52:08


justin_o_guy2 wrote:
"But when these things begin to take place, straighten up and lift up your heads, because your redemption is drawing near."
Luke 21:28

Come quickly..

Couldn't agree more, brother!
