Under The Banyan Tree By Sam Awrabi

By: Sam Awrabi

About this content

AI-Native is the podcast hosted by Sam Awrabi, Founder & Solo General Partner at Banyan Ventures, an AI-native venture fund with nearly $17M AUM and a track record of backing early-stage companies at pre-seed and seed that have gone on to generate over $490M in revenue. Sam draws on deep, first-hand AI infrastructure experience and thousands of founder relationships to explore how the next generation of AI-native products is being built, sold, and scaled. Each episode dives into the tools, mindsets, and systems driving the most important technological revolution of our lifetime.

© 2025 Under The Banyan Tree By Sam Awrabi
Episodes
  • How This YC-Backed CEO Is Reinventing Data Centers With AI—Drawing on Her Microsoft Experience
    2025/08/14

    What happens when data centers built for the cloud era try to handle AI workloads? They break down—frequently and expensively. Margarita, CEO & Co-Founder at Aravolta, is a trailblazing entrepreneur building a cutting-edge data center management software company. She brings a solid background in engineering and extensive experience at Microsoft, where she worked on Stargate with OpenAI.


    Data center operations have remained largely unchanged for 15-20 years, creating a massive efficiency gap just as computational demands are skyrocketing. After building internal tools at Microsoft for supercomputer deployments such as OpenAI's, Margarita recognized the opportunity to transform the entire industry. She discovered that 98% of data center operators were deeply unhappy with their existing software systems but had few alternatives available.

    The technical challenge is immense: connecting to hundreds of different device types across dozens of vendors, each with its own proprietary communication protocols. Aravolta's breakthrough came through developing a middleware layer that can rapidly integrate with virtually any equipment, from transformers to cooling systems to the GPUs themselves. This foundation of comprehensive data collection enables their AI to analyze thousands of inputs simultaneously, identifying inefficiencies humans simply cannot see.
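    A middleware layer like the one described can be pictured as a thin adapter interface that normalizes each vendor's proprietary protocol into a common telemetry format. This is a generic sketch of that pattern, not Aravolta's actual design; every class and field name below is hypothetical.

```python
from abc import ABC, abstractmethod

class DeviceAdapter(ABC):
    """Normalizes one vendor's proprietary protocol into common telemetry."""

    @abstractmethod
    def read_telemetry(self) -> dict:
        ...

class TransformerAdapter(DeviceAdapter):
    """Hypothetical adapter for a power transformer's vendor protocol."""

    def read_telemetry(self) -> dict:
        # A real adapter would speak the vendor's wire protocol here.
        return {"device": "transformer", "load_kw": 420.0}

class CoolingAdapter(DeviceAdapter):
    """Hypothetical adapter for a chiller's vendor protocol."""

    def read_telemetry(self) -> dict:
        return {"device": "chiller", "supply_temp_c": 18.5}

def collect(adapters: list[DeviceAdapter]) -> list[dict]:
    """Fan readings from every device type into one uniform stream."""
    return [adapter.read_telemetry() for adapter in adapters]

readings = collect([TransformerAdapter(), CoolingAdapter()])
```

    The point of the pattern is that supporting a new device type means writing one small adapter, while the analysis layer keeps consuming a single uniform stream.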

    The results speak for themselves: immediate 2-5% power savings upon deployment, 15% increases in uptime, and significantly extended hardware lifespans. This last point is particularly critical as some data centers are burning through million-dollar GPU clusters in just one year instead of their expected 7-8 year lifespan due to workload spikes and suboptimal operations.

    Throughout our conversation, Margarita shares insights from her founder journey—from immigrant parents who taught extreme gratitude and self-reliance, to navigating the Y Combinator experience, to building a company that's now working with the most sophisticated compute providers in the world. Her approach to fundraising, hiring, and product development offers valuable lessons for anyone building in the AI infrastructure space.

    Ready to see how intelligent infrastructure management could transform your data center operations? Listen now to discover how small, focused teams are solving massive challenges that once seemed insurmountable.

    1 hr 34 min
  • Breaking the AI Energy Barrier: The Future of Logarithmic Math in Chips with Gilles Backhus
    2025/08/07

    Gilles Backhus is the Co-Founder and VP of AI at Recogni, an innovative company building AI inference chips. Gilles has an extensive background in the AI industry, focused on making AI computation more efficient through advances such as logarithmic math. His work places Recogni at the forefront of AI hardware development outside of Nvidia's ecosystem. He brings years of research and experience in the field, aiming to revolutionize the energy efficiency and capability of AI chips.

    Chapters:
    0:00 Revolutionizing AI Compute with Energy-Efficient Logarithmic Math
    2:45 The Future of AI-Generated Content and Its Impact
    7:22 AI's Impact on Company Structures and Investment Standards
    14:42 AI Twins and the Future of Remote Work
    18:10 The Cultural Impact of AI and the Decline of Truth
    25:13 Innovations and Challenges in AI Chip Technology
    37:14 Logarithmic Math Revolutionizing AI Efficiency and Model Parameters
    39:38 Understanding FPGA and ASIC Chip Design in AI Infrastructure
    43:11 Nvidia's Dominance in the GPU Market and Ecosystem Control
    46:26 Nvidia's Expanding Influence and Potential Monopoly Concerns
    49:32 Specialized Inference Chips Versus Nvidia's Versatile Hardware
    51:12 The Challenges and Future of AI Model Efficiency
    54:56 The Fascinating World of Software-Defined Data Centers

    Episode Summary:

    In this compelling episode of the AI Native Podcast, host Sam Awrabi sits down with Gilles Backhus, co-founder of Recogni, to delve into the transformative impact of AI on various facets of technology and industry. They discuss how AI's rapid growth is reshaping domains like AI inference and generative AI, highlighting the massive potential for expansion, especially in areas that venture beyond traditional AI modalities like video. Gilles shares his insights on the future of AI in generating content, explaining how emerging technologies could significantly alter everyday internet interactions, such as personalized video content creation.

    Through an in-depth conversation, Gilles and Sam explore the intricacies of AI infrastructure, focusing on innovations in AI chip development. Gilles offers a detailed look at Recogni's groundbreaking work in logarithmic math, which allows for efficient AI computation by minimizing costly mathematical operations. As investors and technologists navigate this rapidly evolving ecosystem, Gilles shares his thoughts on the potential disruptions and novel opportunities that lie ahead in the AI industry.
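    As an illustrative aside, and not Recogni's actual implementation: the core trick of log-domain arithmetic is that multiplying two numbers reduces to adding their logarithms, and an adder is far cheaper in silicon than a hardware multiplier. A minimal Python sketch of the idea:

```python
import math

def log_encode(x: float, eps: float = 1e-12):
    """Store a value as (sign, log2 of magnitude)."""
    return (1 if x >= 0 else -1, math.log2(abs(x) + eps))

def log_multiply(a, b):
    """Multiplication in the log domain: signs multiply, exponents add."""
    sign_a, log_a = a
    sign_b, log_b = b
    return (sign_a * sign_b, log_a + log_b)

def log_decode(v) -> float:
    """Convert back from the log domain."""
    sign, log_mag = v
    return sign * (2.0 ** log_mag)

# 3.0 * 4.0 costs only one addition in the log domain
product = log_decode(log_multiply(log_encode(3.0), log_encode(4.0)))
```

    Real chips quantize the log-domain representation to a few bits and must handle addition of log-encoded values differently; this sketch only shows why the multiplications themselves disappear.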

    Key Takeaways:

    • AI is rapidly evolving, with video content creation expected to be a significant driver of future growth.
    • Recogni's innovation in logarithmic math addresses AI's energy consumption, allowing more efficient calculations by reducing reliance on multiplications.
    • The AI inference market is vast and will continue to grow, driven by advances in hardware innovation beyond Nvidia's ecosystem.
    • The complexity of data centers, coupled with AI infrastructure demands, emphasizes the need for optimized AI compute to sustain the advancing AI industry.
    • Companies today must adapt to the rising bar set by AI innovations, ensuring efficiency and capability remain at the forefront of technological breakthroughs.

    Notable Quotes:

    1. "We found a way to basically eliminate that. A few other companies, big companies, have realized also by now that this is the way to go. We're the first company to implement logarithmic math."
    2. "I think over the next few years it's just going to continue at a very aggressive growth that you could somewhat model."
    3. "You can put it in numbers, but it's really unimaginable."
    4. "AI is this new tool now that allows you to be extremely cost-effective and also a little bit faster even for more or less established startups."
    59 min
No reviews yet