Two very interesting articles. Thank you for that!
The analog processor in particular is a game changer, with computation happening directly in memory. Analog computers are a very interesting subject in general!
The PhysX debate was also before my time. But I read up on it, and it seems they solved it partly in software. With AI, though, we are talking about hardware limitations, especially memory.
Currently, AI operations mean a lot of time-consuming copy tasks between CPU and GPU.
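That copy overhead is easy to get a feel for even without a GPU. A minimal sketch in plain Python (the buffer size is made up, and an in-memory copy merely stands in for a host-to-device transfer, which would be slower still over PCIe):

```python
import time

# Stand-in for model weights/activations: ~64 MB of zero bytes (hypothetical size).
payload = bytes(64 * 1024 * 1024)

start = time.perf_counter()
# Force an actual in-memory copy, playing the role of a CPU -> GPU transfer.
# (bytes(payload) alone would not copy, since bytes are immutable in CPython.)
copied = bytes(bytearray(payload))
elapsed = time.perf_counter() - start

print(f"copied {len(copied) // (1024 * 1024)} MB in {elapsed * 1000:.2f} ms")
```

On real hardware the PCIe transfer adds latency on top of the raw copy, which is exactly why computing directly in memory is so attractive.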
I’d guess that an AI card should have a great value proposition to be worth buying.
Sure, the card would need to offer great value or at least an accessible price. It probably also depends on how "heavy" the tasks get. But seeing e.g. OpenAI struggling with request volume, it may be useful to decentralize the processing by running the model locally on the user's PC.
I’m not sure where you’re going with this, but it feels wrong. I’m not buying a hardware part that cannot function without a constant internet connection or regular payment.
Maybe this statement was a bit confusing. What I meant was that, in a transition phase, developers could choose to allow the use of a dedicated accelerator card to run everything locally and offline. For people who don't have or want such a card, they could provide a cloud-based subscription model where the processing is done on remote servers.
Should definitely be about superb owls
Calibre allows you to maintain a library of your ebooks and sync it with devices (including Kindle). It can also convert ebooks to whatever formats you need.
I mostly use it to get ebooks from z-lib and put them on my Kindle. But it works the other way around, too.
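For the conversion part, Calibre also ships command-line tools alongside the GUI. A quick sketch using its `ebook-convert` and `calibredb` utilities (the file names are made up):

```shell
# Convert an EPUB to Kindle-friendly AZW3 (file names are hypothetical):
ebook-convert mybook.epub mybook.azw3

# Optionally add the result to your Calibre library from the command line:
calibredb add mybook.azw3
```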