What did we cover?
We explored how next-generation AI infrastructure is shaping the future of reinforcement learning, large-scale inference, and world models.
Who spoke?
- Paul Chang (DataCrunch): Training World Models Using B200s
- Erik Schultheis (IST Austria): Training Quantized LLMs Efficiently on Consumer GPUs
Any questions, thoughts, or takeaways from the session?
Drop them below; we'd love to hear your perspective and keep the discussion going.