Is this correct? I was under the impression that the most expensive part of an llm is the training, and once that's done the cost of running a prompt is negligible.
I get your point that this last part doesn’t scale well, but the far larger cost of training must get very diluted if they distribute it across a large user base.
I agree, scaling users isn't the issue. What is, is the never-ending chase for the mirage that is AGI. They'll throw every processing cycle they can muster at that fever dream; that's the financial black hole.
Yes, but don’t underestimate the power of centralisation.
6 months ago you could set up a server for running a decent local llm for under 800.
By increasing the demands and pushing the price of hardware up, they are effectively gatekeeping access to llms.
I think the plan is that we will need to rely on these companies for compute power and llm services, and then they can do all sorts of nefarious things.