Small costs add up over time: if you use the cloud long enough, it ends up more expensive than owning your own hardware (and that's not even counting the free electricity at school).

https://timdettmers.com/2023/01/30/which-gpu-for-deep-learning/
When is it better to use the cloud vs a dedicated GPU desktop/server?
Rule-of-thumb: If you expect to do deep learning for longer than a year, it is cheaper to get a desktop GPU. Otherwise, cloud instances are preferable unless you have extensive cloud computing skills and want the benefits of scaling the number of GPUs up and down at will.
The exact point at which a cloud GPU becomes more expensive than a desktop depends heavily on the service you are using, and it is best to do a little math on this yourself. Below is an example calculation for an AWS on-demand instance with a single V100, compared to a desktop with a single RTX 3090 (similar performance). The desktop with the RTX 3090 costs $2,200 (2-GPU barebone + RTX 3090). Additionally, assuming you are in the US, electricity costs $0.12 per kWh. This compares to $2.14 per hour for the AWS on-demand instance.
At 15% utilization, the desktop uses:
(350 W (GPU) + 100 W (CPU)) * 0.15 (utilization) * 24 hours * 365 days = 591 kWh per year
So that is 591 kWh of electricity per year, which at $0.12 per kWh adds an extra $71 per year.
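To make that arithmetic easy to redo with your own wattages and local electricity price, here is a minimal Python sketch of the same calculation. The constants are the assumptions from the paragraph above, and the variable names are just illustrative:

```python
# Rough re-computation of the electricity estimate above.
# All numbers are the article's assumptions; names are illustrative.
GPU_WATTS = 350       # RTX 3090 under load
CPU_WATTS = 100       # rest of the system
UTILIZATION = 0.15    # fraction of the day the machine is training
PRICE_PER_KWH = 0.12  # assumed US electricity price, $/kWh

kwh_per_year = (GPU_WATTS + CPU_WATTS) * UTILIZATION * 24 * 365 / 1000
cost_per_year = kwh_per_year * PRICE_PER_KWH
print(f"{kwh_per_year:.0f} kWh/year, ${cost_per_year:.0f}/year")
# 591 kWh/year, $71/year
```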
At 15% utilization (you use the machine 15% of the time each day), the break-even point for the desktop vs the cloud instance comes at about 300 days ($2,311 for the cloud vs $2,270 for the desktop, i.e. $2,200 of hardware plus roughly a year of electricity):
$2.14/h * 0.15 (utilization) * 24 hours * 300 days = $2,311
So if you expect to run deep learning models for more than 300 days, it is better to buy a desktop instead of using AWS on-demand instances.
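If you want to find the break-even point for your own service and electricity rates, here is a small sketch of the same comparison in Python. It uses the figures above but charges the desktop's electricity day by day rather than as a flat yearly $71, so the crossover lands a few days earlier than the rounded 300-day figure; the function names are made up for this example:

```python
# Break-even sketch using the article's numbers.
CLOUD_RATE = 2.14                # $/h for the AWS on-demand V100 instance
UTILIZATION = 0.15               # same 15% utilization as above
DESKTOP_PRICE = 2200.0           # $, 2-GPU barebone + RTX 3090
ELEC_PER_DAY = 591 * 0.12 / 365  # ~$0.19/day of desktop electricity

def cloud_cost(days: int) -> float:
    return CLOUD_RATE * UTILIZATION * 24 * days

def desktop_cost(days: int) -> float:
    return DESKTOP_PRICE + ELEC_PER_DAY * days

# First day on which renting has cost at least as much as owning.
break_even = next(d for d in range(1, 3651) if cloud_cost(d) >= desktop_cost(d))
print(f"break-even after {break_even} days "
      f"(${cloud_cost(break_even):,.0f} vs ${desktop_cost(break_even):,.0f})")
# break-even after 293 days ($2,257 vs $2,257)
```

Swapping in your own hourly rate, utilization, and electricity price gives the corresponding crossover for your setup.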