Can Photonics Power Next-Gen AI Chatbots?

What have you asked ChatGPT to do lately? Chances are, if you haven’t tried the AI chatbot yourself, you know someone who has. From writing essays and code to explaining complex concepts, ChatGPT is blowing minds around the world with its speed and human-sounding prose. It’s also another example of how AI is becoming more accessible and pervasive in our smart-everything world.

As compute-intensive applications like AI and machine learning (ML) become more ingrained in our lives, it’s worth considering the underlying infrastructure that makes these innovations possible. Simply put, these applications place a heavy load on the hardware that processes the algorithms, runs the models and keeps data flowing.

Hyperscale data centers with very high-performing compute resources have emerged to tackle the workloads of AI, high-performance computing and big data analytics. However, it is becoming increasingly clear that the traditional copper interconnects that bring together different components inside these data centers are hitting a bandwidth limit. This is where photonic integrated circuits (PICs), which use the power of light, can play a pivotal role.

Photonics provides an avenue not only to higher performance but also to greater energy efficiency. Photonics can also support miniaturization, helping shrink both the physical footprint and the power consumption of these systems.