Cloud vs Local GPU Hosting (what to use and when?)


26 thoughts on “Hosting Vs Cloud – Cloud vs Local GPU Hosting (what to use and when?)”

  1. Nice, informative video. I'd like to mention one point in favor of the cloud, AWS specifically. If you choose a spot block instance, you can define the duration for which you reserve the machine, anywhere from 1 to 6 hours. These are quite a bit cheaper than on-demand instances. For example, a g4dn.2xlarge instance reserved for 6 hours costs about $0.58 per hour, while the on-demand rate is $0.828 per hour. If you reserve for a shorter duration, costs are even lower. Google Colab is another good alternative, where you get access to GPU instances for free. I think a notebook can run uninterrupted for 12 hours max, but that's still a great option.

  2. Hi, I'm a complete beginner.

    I have a question: if I purchase a server to supplement my main computer, do I install the demanding programs on the server, or on my main machine?

    In other words, can programs on the primary computer take advantage of the server's processing power?

  3. Excuse me if this has already been asked, but I'm curious whether you have ever talked with other YouTubers (Linus Tech Tips comes to mind specifically) about creating a collaboration video or a benchmark system for GPU testing with neural networks?

    I feel like there could be tremendous value for individuals or organizations in being able to compare their compute needs against many possible computer specs. Clearly, more VRAM and higher clock speeds will perform better, but finding the marginal cost relative to marginal output could be helpful and very interesting. While your video makes it clear that cloud computing is great and likely the best option for most individuals, for people like me who are really excited about neural networks and also enjoy PC gaming, streaming, and video editing, it could prove quite valuable in choosing our own upgrade path.

  4. I play around as a hobby, and I went all in and built an expensive PC with an RTX 2080 Ti, as well as a high-end gaming laptop with a GTX 1060. I barely use them enough to justify the cost of the GPUs at all. Should have stuck with Paperspace. 😉 Still, the psychological factor of paying $2 per hour in the cloud for something I don't own makes me nervous every time, and it's less convenient than having your own physical machine.

    The cost of a gaming GPU these days is a little painful for me. Like you say, though, it's useful to have something local to work on and go to the cloud when you need more. I just won't upgrade my local GPU in the future; if I need an upgrade, cloud it is.

  5. Great insights! If I'm not mistaken, I haven't seen you mention Google's TPUs for training models. In some presentations they even say TPUs are free (I don't know to what degree that's true 🙂). It would be interesting to hear your thoughts on Google's TPUs…
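The cost figures quoted in the comments above can be sanity-checked with a quick back-of-envelope script. The AWS rates are the ones quoted in comment 1; the local GPU price is an illustrative assumption (roughly what a high-end card cost at the time), not a figure from the comments:

```python
# Back-of-envelope checks on the cloud-cost figures quoted in the comments.
# AWS rates are as quoted in comment 1; the local GPU price is an assumption.

SPOT_BLOCK_RATE = 0.58    # $/hr, g4dn.2xlarge 6-hour spot block (quoted)
ON_DEMAND_RATE = 0.828    # $/hr, g4dn.2xlarge on-demand (quoted)
CLOUD_RATE = 2.0          # $/hr, the rate mentioned in comment 4
LOCAL_GPU_PRICE = 1200.0  # assumed purchase price of a high-end local GPU

# Savings of a 6-hour spot block vs. the same 6 hours on-demand.
hours = 6
spot_cost = SPOT_BLOCK_RATE * hours
on_demand_cost = ON_DEMAND_RATE * hours
savings = 1 - spot_cost / on_demand_cost
print(f"6h spot block ${spot_cost:.2f} vs on-demand ${on_demand_cost:.2f} "
      f"({savings:.0%} cheaper)")

# Hours of cloud use at which renting costs as much as buying locally.
break_even = LOCAL_GPU_PRICE / CLOUD_RATE
print(f"Break-even vs a ${LOCAL_GPU_PRICE:.0f} local GPU: "
      f"{break_even:.0f} GPU-hours")
```

At these assumed numbers, the spot block saves about 30% per session, and the cloud only overtakes the purchase price after roughly 600 GPU-hours of use, which is why light hobby use tends to favor renting.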
