Why doesn’t software exploit GPU capabilities for normal tasks?

When I started writing ML code I used the CPU; recently I switched my workflow to the GPU and CUDA, and the same task now runs about 60 times faster.
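For concreteness (the post doesn't name a library, so this is my assumption): the kind of switch described above is often small in code terms, because GPU array libraries such as CuPy deliberately mirror NumPy's API. A minimal sketch, with the CPU version runnable anywhere and the GPU equivalent shown in comments since it needs a CUDA machine:

```python
import numpy as np

def matmul_cpu(n=256, seed=0):
    """Dense matrix multiply on the CPU with NumPy."""
    rng = np.random.default_rng(seed)
    a = rng.standard_normal((n, n))
    b = rng.standard_normal((n, n))
    return a @ b

# On a CUDA machine the GPU version is nearly identical (hypothetical usage):
#   import cupy as cp
#   a_gpu = cp.asarray(a)      # copy to device memory
#   b_gpu = cp.asarray(b)
#   c_gpu = a_gpu @ b_gpu      # the multiply runs on the GPU
#   c = cp.asnumpy(c_gpu)      # copy the result back to host memory
```

The host/device copies in the comments are the key design point: the speedup only pays off when the computation is large enough to amortize moving data to and from the GPU.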

Now that GPU usage and acceptance are relatively widespread, I wonder why most general-purpose software (Excel, Chrome, etc.) doesn't exploit GPU capabilities for better performance, rather than relying on faster CPUs and higher clock speeds.

I know the GPU ecosystem is not as mature as the CPU's, but a switch to GPUs seems like the natural next step.
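One factor worth weighing here (my addition, not from the post) is Amdahl's law: a GPU only accelerates the fraction of a workload that is data-parallel, and typical desktop-app work is dominated by serial, branchy logic. A small sketch of the arithmetic:

```python
def amdahl_speedup(parallel_fraction, n_units):
    """Overall speedup when only `parallel_fraction` of the work
    parallelizes across `n_units` execution units (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_units)

# ML kernels are almost entirely parallel, so thousands of GPU cores help:
#   amdahl_speedup(0.99, 1000)  -> about 91x
# If only ~10% of an app's work parallelizes, the same hardware barely helps:
#   amdahl_speedup(0.10, 1000)  -> about 1.1x
```

The fractions above are illustrative placeholders, but they show why a 60x win on matrix-heavy ML code doesn't translate to spreadsheets or browsers whose hot paths are mostly sequential.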

Thoughts?

submitted by /u/card_chase

from Software Development – methodologies, techniques, and tools. Covering Agile, RUP, Waterfall + more! https://ift.tt/3GBUPjv
