Field programmable gate arrays (FPGAs) solve many of the problems GPUs face in running deep learning models.

The renewed interest in artificial intelligence in the past decade has been a boon for the graphics cards industry. Companies like Nvidia and AMD have seen a huge boost to their stock prices as their GPUs have proven to be very efficient for training and running deep learning models. Nvidia, in fact, has even pivoted from a pure GPU and gaming company to a provider of cloud GPU services and a competent AI research lab.

But GPUs also have inherent flaws that pose challenges in putting them to use in AI applications, according to Ludovic Larzul, CEO and co-founder of Mipsology, a company that specializes in machine learning software.

The solution, Larzul says, is field programmable gate arrays (FPGAs), an area where his company specializes. An FPGA is a type of processor that can be customized after manufacturing, which makes it more efficient than generic processors. FPGAs, however, are very hard to program, a problem Larzul hopes to solve with a new platform his company has developed.

Specialized AI hardware has become an industry of its own, and the jury is still out on what the best infrastructure for deep learning algorithms will be. But if Mipsology succeeds in its mission, it could benefit the many AI developers working in areas where GPUs currently struggle.

The challenges of using GPUs for deep learning

Three-dimensional graphics, the original reason GPUs are packed with so much memory and computing power, have one thing in common with deep neural networks: they require massive amounts of matrix multiplications. Graphics cards can perform matrix multiplications in parallel, which speeds up operations tremendously and can cut the training time of neural networks from days and weeks to hours and minutes.
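To make that speedup concrete, here is a minimal sketch that times the same matrix multiplication on the CPU and on a GPU. It assumes PyTorch and a CUDA-capable card, neither of which the article prescribes; any framework that dispatches matrix math to the GPU would illustrate the same point.

```python
# A minimal sketch, assuming PyTorch and a CUDA-capable GPU (both are
# assumptions; the article does not prescribe a framework or device).
import time

import torch

n = 4096
a = torch.rand(n, n)
b = torch.rand(n, n)

# Time the multiplication on the CPU first.
start = time.perf_counter()
a @ b
cpu_seconds = time.perf_counter() - start

if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()  # make sure the host-to-device copies finished
    start = time.perf_counter()
    a_gpu @ b_gpu
    torch.cuda.synchronize()  # GPU kernels launch asynchronously, so wait
    gpu_seconds = time.perf_counter() - start
    print(f"CPU: {cpu_seconds:.3f}s, GPU: {gpu_seconds:.3f}s")
else:
    print(f"CPU: {cpu_seconds:.3f}s (no CUDA device found)")
```

Each of the thousands of GPU cores computes a slice of the output matrix at the same time, which is exactly the workload that deep neural network layers produce.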
Aside from boosting the stock of graphics hardware companies, the appeal of GPUs in deep learning has given rise to a host of public cloud services that offer virtual machines with powerful GPUs for deep learning projects.

But graphics cards also have hardware and environmental limitations. “Neural network training is typically conducted in an environment that is not comprehensive of the varying constraints that the system running the neural network will experience in deployment – this can put a strain on the real-world use of GPUs,” Larzul says.

GPUs require a lot of electricity, produce a lot of heat, and use fans for cooling. This is not much of a problem when you’re training your neural network on a desktop workstation, a laptop, or a server rack. But many of the environments where deep learning models are deployed are not friendly to GPUs, such as self-driving cars, factories, robotics, and many smart-city settings where the hardware has to endure heat, dust, humidity, motion, and power constraints.

“Some critical applications like smart cities’ video surveillance require hardware to be exposed to environmental factors (like the sun, for example) that negatively impact GPUs,” Larzul says. “GPUs are at the technology limit of transistors, causing them to run at high temperatures and require significant cooling, which is not always possible. This means more power, electricity, maintenance costs, etc.”

In general, GPUs last around two to five years, which isn’t a major issue for gamers, who usually replace their computers every few years. But in other domains, such as the automotive industry, where there’s an expectation of higher durability, it can become problematic, especially as GPUs can fail faster due to exposure to environmental factors and more intense usage.
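As a small illustration of the heat and power draw described above, the snippet below polls an NVIDIA card’s temperature and board power through the vendor’s nvidia-smi tool. This is a hedged sketch, not something the article itself describes: it assumes an NVIDIA GPU with the driver installed so that nvidia-smi is on the path.

```python
# A small sketch, assuming an NVIDIA GPU and driver are installed so that
# the vendor's nvidia-smi command-line tool is available.
import subprocess

result = subprocess.run(
    [
        "nvidia-smi",
        "--query-gpu=temperature.gpu,power.draw",
        "--format=csv,noheader",
    ],
    capture_output=True,
    text=True,
    check=True,
)

# Each line of output looks like "63, 214.35 W": the temperature in
# degrees Celsius, then the current board power draw. Take the first GPU.
first_gpu = result.stdout.strip().splitlines()[0]
temperature, power = (field.strip() for field in first_gpu.split(","))
print(f"GPU temperature: {temperature} °C, power draw: {power}")
```

Watching these numbers climb under load makes it easy to see why fan-cooled cards are a poor fit for sealed, dusty, or sun-exposed enclosures.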