It costs a lot to build an AI company, which is why the most competitive ones are either existing tech giants with an abundance of cash to burn or start-ups that have raised billions of dollars, largely from existing tech giants with an abundance of cash to burn. A product like ChatGPT was unusually expensive to build for two main reasons. One is constructing the model itself, a large language model: a process in which patterns and relationships are extracted from enormous amounts of data using massive clusters of processors and a lot of electricity. This is called training. The other is actively providing the service, letting users interact with the trained model, which also requires access to, or ownership of, a lot of powerful computing hardware. This is called inference.