
The MAD Podcast with Matt Turck

Matt Turck

Available Episodes

5 of 85
  • Inside the Paper That Changed AI Forever - Cohere CEO Aidan Gomez on 2025 Agents
    What really happened inside Google Brain when the “Attention is All You Need” paper was born? In this episode, Aidan Gomez — one of the eight co-authors of the Transformers paper and now CEO of Cohere — reveals the behind-the-scenes story of how a cold email and a lucky administrative mistake landed him at the center of the AI revolution.
    Aidan shares how a group of researchers, given total academic freedom, accidentally stumbled into one of the most important breakthroughs in AI history — and why the architecture they created still powers everything from ChatGPT to Google Search today.
    We dig into why synthetic data is now the secret sauce behind the world’s best AI models, and how Cohere is using it to build enterprise AI that’s more secure, private, and customizable than anything else on the market. Aidan explains why he’s not interested in “building God” or chasing AGI hype, and why he believes the real impact of AI will be in making work more productive, not replacing humans.
    You’ll also get a candid look at the realities of building an AI company for the enterprise: from deploying models on-prem and air-gapped for banks and telecoms, to the surprising demand for multimodal and multilingual AI in Japan and Korea, to the practical challenges of helping customers identify and execute on hundreds of use cases. (A minimal attention sketch follows this entry for context.)
    Links:
    Cohere: Website - https://cohere.com | X/Twitter - https://x.com/cohere
    Aidan Gomez: LinkedIn - https://ca.linkedin.com/in/aidangomez | X/Twitter - https://x.com/aidangomez
    FIRSTMARK: Website - https://firstmark.com | X/Twitter - https://twitter.com/FirstMarkCap
    Matt Turck (Managing Director): LinkedIn - https://www.linkedin.com/in/turck/ | X/Twitter - https://twitter.com/mattturck
    Chapters: (00:00) Intro (02:00) The Story Behind the Transformers Paper (03:09) How a Cold Email Landed Aidan at Google Brain (10:39) The Initial Reception to the Transformers Breakthrough (11:13) Google’s Response to the Transformer Architecture (12:16) The Staying Power of Transformers in AI (13:55) Emerging Alternatives to Transformer Architectures (15:45) The Significance of Reasoning in Modern AI (18:09) The Untapped Potential of Reasoning Models (24:04) Aidan’s Path After the Transformers Paper and the Founding of Cohere (25:16) Choosing Enterprise AI Over AGI Labs (26:55) Aidan’s Perspective on AGI and Superintelligence (28:37) The Trajectory Toward Human-Level AI (30:58) Transitioning from Researcher to CEO (33:27) Cohere’s Product and Platform Architecture (37:16) The Role of Synthetic Data in AI (39:32) Custom vs. General AI Models at Cohere (42:23) The AYA Models and Cohere Labs Explained (44:11) Enterprise Demand for Multimodal AI (49:20) On-Prem vs. Cloud (50:31) Cohere’s North Platform (54:25) How Enterprises Identify and Implement AI Use Cases (57:49) The Competitive Edge of Early AI Adoption (01:00:08) Aidan’s Concerns About AI and Society (01:01:30) Cohere’s Vision for Success in the Next 3–5 Years
    --------  
    1:02:24
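    For readers who want a concrete picture of the architecture this episode centers on, below is a minimal sketch of scaled dot-product attention, the core operation of the Transformer described in “Attention is All You Need.” It is an illustrative NumPy toy, not Cohere’s or Google’s implementation; the shapes, names, and random example data are chosen purely for clarity.

        import numpy as np

        def scaled_dot_product_attention(Q, K, V):
            """Toy single-head attention: softmax(Q K^T / sqrt(d_k)) V."""
            d_k = Q.shape[-1]
            scores = Q @ K.T / np.sqrt(d_k)                    # pairwise query-key similarities
            weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
            weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
            return weights @ V                                 # weighted mix of value vectors

        # Example: a sequence of 4 tokens with 8-dimensional embeddings, attending to itself
        rng = np.random.default_rng(0)
        x = rng.normal(size=(4, 8))
        out = scaled_dot_product_attention(x, x, x)
        print(out.shape)                                       # (4, 8)

    In a full Transformer, Q, K, and V come from learned projections of the input, and many such heads run in parallel in every layer.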
  • AI That Ends Busy Work — Hebbia CEO on “Agent Employees”
    What if the smartest people in finance and law never had to do “stupid tasks” again? In this episode, we sit down with George Sivulka, founder of Hebbia, the AI company quietly powering 50% of the world’s largest asset managers and some of the fastest-growing law firms. George reveals how Hebbia’s Matrix platform is automating the equivalent of 50,000 years of human reading — every year — and why the future of work is hybrid teams of humans and AI “agent employees.”
    You’ll get the inside story on how Hebbia went from a stealth project at Stanford to a multinational company trusted by the Department of Defense, and why their spreadsheet-inspired interface is leaving chatbots in the dust. George breaks down the technical secrets behind Hebbia’s ISD architecture (and why they killed RAG), how they process billions of pages with near-zero hallucinations, and what it really takes to sell AI into the world’s most regulated industries.
    We also dive into the future of organizational design, why generalization beats specialization in AI, and how “prompting is the new management skill.” Plus: the real story behind AI hallucinations, the myth of job loss, and why naiveté might be the ultimate founder superpower. (A toy retrieval-augmented generation sketch follows this entry as background.)
    Links:
    Hebbia: Website - https://www.hebbia.com | Twitter - https://x.com/HebbiaAI
    George Sivulka: LinkedIn - https://www.linkedin.com/in/sivulka | Twitter - https://x.com/gsivulka
    FIRSTMARK: Website - https://firstmark.com | Twitter - https://twitter.com/FirstMarkCap
    Matt Turck (Managing Director): LinkedIn - https://www.linkedin.com/in/turck/ | Twitter - https://twitter.com/mattturck
    Chapters: (00:00) Intro (01:46) What is Hebbia (02:49) Evolving Hebbia’s mission (04:45) The founding story and Stanford's inspiration (09:45) The rise of agent employees and AI in organizations (12:36) The future of AI-powered work (15:17) AI research trends (19:49) Inside Matrix: Hebbia’s flagship AI platform (24:02) Why Hebbia isn’t just another chatbot (28:27) Moving beyond RAG: Hebbia’s unique architecture (34:10) Tackling hallucinations in high-stakes AI (35:59) Research culture and avoiding industry groupthink (39:40) Innovating go-to-market and enterprise sales (41:57) Real-world value: Cost savings and new revenue (43:49) How AI is changing junior roles (45:55) Leadership and perspective as a young founder (47:16) Hebbia’s roadmap: Success in the next 3 years
    --------  
    48:24
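    The episode contrasts Hebbia’s ISD architecture with retrieval-augmented generation (RAG). ISD’s internals are not described in this summary, so as background only, here is a toy sketch of the RAG baseline the description says Hebbia moved beyond: retrieve the most relevant documents, then stuff them into a prompt. The bag-of-words embedding, the sample documents, and the prompt string are illustrative assumptions, not Hebbia’s stack.

        import numpy as np
        from collections import Counter

        DOCS = [
            "Q2 revenue grew 12 percent driven by asset management fees.",
            "The credit agreement includes a change of control covenant.",
            "Headcount in the legal team rose to 40 attorneys.",
        ]

        def embed(text, vocab):
            """Illustrative bag-of-words vector; real systems use learned embeddings."""
            counts = Counter(text.lower().split())
            return np.array([counts[w] for w in vocab], dtype=float)

        def retrieve(query, docs, k=2):
            """Rank documents by cosine similarity to the query and return the top k."""
            vocab = sorted({w for d in docs + [query] for w in d.lower().split()})
            d_vecs = np.stack([embed(d, vocab) for d in docs])
            q_vec = embed(query, vocab)
            sims = d_vecs @ q_vec / (np.linalg.norm(d_vecs, axis=1) * np.linalg.norm(q_vec) + 1e-9)
            return [docs[i] for i in np.argsort(-sims)[:k]]

        query = "What covenants are in the credit agreement?"
        context = "\n".join(retrieve(query, DOCS))
        prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
        print(prompt)  # in a real pipeline this prompt would be sent to an LLM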
  • AI Eats the World: Benedict Evans on What Really Matters Now
    What if the “AI revolution” is actually… stuck in the messy middle? In this episode, Benedict Evans returns to tackle the big question we left hanging a year ago: Is AI a true paradigm shift, or just another tech platform shift like mobile or cloud? One year later, the answer is more complicated — and more revealing — than anyone expected.
    Benedict pulls back the curtain on why, despite all the hype and model upgrades, the core LLMs are starting to look like commodities. We dig into the real battlegrounds: distribution, brand, and the race to build sticky applications. Why is ChatGPT still topping the App Store charts while Perplexity and Claude barely register outside Silicon Valley? Why did OpenAI just hire a CEO of Applications, and what does that signal about the future of AI products?
    We go deep on the “probabilistic” nature of LLMs, why error rates are still the elephant in the room, the future of consumer AI (is there a killer app beyond chatbots and image generators?), the impact of generative content on e-commerce and advertising, and whether “AI agents” are the next big thing — or just another overhyped demo.
    And, we ask: What happened to AI doomerism? Why did the existential risk debate suddenly vanish, and what risks should we actually care about?
    Links:
    Benedict Evans: LinkedIn - https://www.linkedin.com/in/benedictevans | Threads - https://www.threads.net/@benedictevans
    FIRSTMARK: Website - https://firstmark.com | X/Twitter - https://twitter.com/FirstMarkCap
    Matt Turck (Managing Director): LinkedIn - https://www.linkedin.com/in/turck/ | X/Twitter - https://twitter.com/mattturck
    Chapters: (00:00) Intro (01:47) Is AI a Platform Shift or a Paradigm Shift? (07:21) Error Rates and Trust in AI (15:07) Adapting to AI’s Capabilities (19:18) Generational Shifts in AI Usage (22:10) The Commoditization of AI Models (27:02) Are Brand and Distribution the Real Moats in AI? (29:38) OpenAI: Research Lab or Application Company? (33:26) Big Tech’s AI Strategies: Apple, Google, Meta, AWS (39:00) AI and Search: Is ChatGPT a Search Engine? (42:41) Consumer AI Apps: Where’s the Breakout? (45:51) The Need for a GUI for AI (48:38) Generative AI in Social and Content (51:02) The Business Model of AI: Ads, Memory, and Moats (55:26) Enterprise AI: SaaS, Pilots, and Adoption (01:00:08) The Future of AI in Business (01:05:11) Infinite Content, Infinite SKUs: AI and E-commerce (01:09:42) Doomerism, Risks, and the Future of AI
    --------  
    1:15:09
  • Jeremy Howard on Building 5,000 AI Products with 14 People (Answer AI Deep-Dive)
    What happens when you try to build the “General Electric of AI” with just 14 people? In this episode, Jeremy Howard reveals the radical inside story of Answer AI — a new kind of AI R&D lab that’s not chasing AGI, but instead aims to ship thousands of real-world products, all while staying tiny, open, and mission-driven.
    Jeremy shares how open-source models like DeepSeek and Qwen are quietly outpacing closed-source giants, and why the best new AI is coming out of China. You’ll hear the surprising truth about the so-called “DeepSeek moment,” why efficiency and cost are the real battlegrounds in AI, and how Answer AI’s “dialogue engineering” approach is already changing lives — sometimes literally.
    We go deep on the tools and systems powering Answer AI’s insane product velocity, including Solve It (the platform that’s helped users land jobs and launch startups), Shell Sage (AI in your terminal), and FastHTML (a new way to build web apps in pure Python). Jeremy also opens up about his unconventional path from philosophy major and computer game enthusiast to world-class AI scientist, and why he believes the future belongs to small, nimble teams who build for societal benefit, not just profit. (A FastHTML hello-world sketch follows this entry.)
    Links:
    Fast.ai: Website - https://www.fast.ai | X/Twitter - https://twitter.com/fastdotai
    Answer.ai: Website - https://www.answer.ai/ | X/Twitter - https://x.com/answerdotai
    Jeremy Howard: LinkedIn - https://linkedin.com/in/howardjeremy | X/Twitter - https://x.com/jeremyphoward
    FIRSTMARK: Website - https://firstmark.com | X/Twitter - https://twitter.com/FirstMarkCap
    Matt Turck (Managing Director): LinkedIn - https://www.linkedin.com/in/turck/ | X/Twitter - https://twitter.com/mattturck
    Chapters: (00:00) Intro (01:39) Highlights and takeaways from ICLR Singapore (02:39) Current state of open-source AI (03:45) Thoughts on Microsoft Phi and open source moves (05:41) Responding to OpenAI’s open source announcements (06:29) The real impact of the Deepseek ‘moment’ (09:02) Progress and promise in test-time compute (10:53) Where we really stand on AGI and ASI (15:05) Jeremy’s journey from philosophy to AI (20:07) Becoming a Kaggle champion and starting Fast.ai (23:04) Answer.ai mission and unique vision (28:15) Answer.ai’s business model and early monetization (29:33) How a small team at Answer.ai ships so fast (30:25) Why Devin AI agent isn't that great (33:10) The future of autonomous agents in AI development (34:43) Dialogue Engineering and Solve It (43:54) How Answer.ai decides which projects to build (49:47) Future of Answer.ai: staying small while scaling impact
    --------  
    55:02
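    The description above mentions FastHTML, Answer.AI’s library for building web apps in pure Python. As a rough illustration of what that looks like, here is a hello-world sketch modeled on the project’s published examples; exact helper names and defaults may differ between versions, so treat it as an approximation rather than canonical usage.

        # pip install python-fasthtml   (the package name published by Answer.AI)
        from fasthtml.common import *   # provides fast_app, serve, and HTML tag helpers

        app, rt = fast_app()

        @rt("/")
        def get():
            # Pages are composed from Python functions that mirror HTML tags
            return Titled("Hello from FastHTML", P("A web app written in pure Python."))

        serve()  # starts a local development server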
  • Why Influx Rebuilt Its Database for the IoT and Robotics Explosion
    InfluxDB just dropped its biggest update ever — InfluxDB 3.0 — and in this episode, we go deep with the team behind the world’s most popular open-source time series database. You’ll hear the inside story of how InfluxDB grew from 3,000 users in 2015 to over 1.3 million today, and why the company decided to rewrite its entire architecture from scratch in Rust, ditching Go and moving to object storage on S3.
    We break down the real technical challenges that forced this radical shift: the “cardinality problem” that choked performance, the pain of linking compute and storage, and why their custom query language (Flux) failed to catch on, leading to a humbling embrace of SQL as the industry standard. You’ll learn how InfluxDB is positioning itself in a world dominated by Databricks and Snowflake, and the hard lessons learned about monetization when 1.3 million users only yield 2,600 paying customers. (A back-of-the-envelope cardinality sketch follows this entry.)
    Links:
    InfluxData: Website - https://www.influxdata.com | X/Twitter - https://twitter.com/InfluxDB
    Evan Kaplan: LinkedIn - https://www.linkedin.com/in/kaplanevan | X/Twitter - https://x.com/evankaplan
    FIRSTMARK: Website - https://firstmark.com | X/Twitter - https://twitter.com/FirstMarkCap
    Matt Turck (Managing Director): LinkedIn - https://www.linkedin.com/in/turck/ | X/Twitter - https://twitter.com/mattturck
    Foursquare: Website - https://foursquare.com | X/Twitter - https://x.com/Foursquare | IG - instagram.com/foursquare
    Chapters: (00:00) Intro (02:22) The InfluxDB origin story and why time series matters (06:59) The cardinality crisis and why Influx rebuilt in Rust (09:26) Why SQL won (and Flux lost) (16:34) Why InfluxData bets on FDAP (22:51) IoT, Tesla Powerwalls, and real-time control systems (27:54) Competing with Databricks, Snowflake, and the “lakehouse” world (31:50) Open Source lessons, monetization, & what’s next
    --------  
    35:35
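    The “cardinality problem” mentioned in this entry comes down to simple multiplication: in a time series database, every unique combination of tag values is tracked as its own series, so the series count grows as the product of the tag cardinalities. A back-of-the-envelope sketch with made-up numbers (not InfluxData’s figures):

        import math

        # Hypothetical tag cardinalities for an IoT fleet (illustrative numbers only)
        tag_cardinalities = {
            "device_id": 100_000,   # one value per device
            "sensor": 20,           # temperature, voltage, ...
            "firmware": 15,
            "region": 10,
        }

        # Worst case: every combination of tag values shows up at least once,
        # so each combination becomes a distinct series the index must track.
        series = math.prod(tag_cardinalities.values())
        print(f"{series:,} potential series")   # 300,000,000 potential series

    That explosion in tracked series is what the episode calls the cardinality crisis, and part of what motivated the InfluxDB 3.0 rewrite described above.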


About The MAD Podcast with Matt Turck

The MAD Podcast with Matt Turck is a series of conversations with leaders from across the machine learning, AI, and data landscape, hosted by Matt Turck, a leading AI and data investor and Partner at FirstMark Capital.
