AI operates along two paths that will define data center capacity demands in the coming years: training, where models learn tasks by processing large amounts of data, and inference, where models deliver predictions and answers in real time. The bandwidth and low latency these phases require will demand a combination of AI-Ready Data Centers (also called AI Data Centers, or AIDCs), back-end infrastructure that interconnects them, and high-performance cloud exchanges.