Build or buy? Scaling your enterprise gen AI pipeline in 2025


This article is part of VentureBeat's special issue, “AI at Scale: From Vision to Viability.” Read more from this special issue here.

Balancing ambition with practicality has always been a challenge, and in 2025, the stakes are higher than ever. Enterprises racing to adopt large language models (LLMs) are facing a new reality: scaling isn't just about using bigger models or investing in modern tools – it's about integrating AI in ways that transform operations, empower teams and control costs. Success depends on more than technology; it requires a cultural and operational shift that aligns AI capabilities with business goals.

The scaling imperative: Why 2025 is different

As next-generation AI evolves from experiments to enterprise-scale deployment, businesses are facing an inflection point. The excitement of early adoption has given way to the practical challenges of maintaining efficiency, managing costs and ensuring relevance in competitive markets. Scaling AI in 2025 is about answering tough questions: How can businesses leverage generative tools across sectors? What infrastructure will support the growth of AI without depleting resources? And perhaps most importantly, how do teams adapt to AI-driven workflows?

Success depends on three essential principles: identifying clear, high-value use cases; maintaining technological flexibility; and fostering a workforce equipped to adapt. Successful enterprises don't just embrace gen AI – they craft strategies that align the technology with business needs, continuously re-evaluating costs, performance and the cultural shifts necessary for lasting impact. This approach is not just about using modern tools; it's about building operational resilience and scalability in an environment where technology and markets are changing at a rapid pace.

Companies like Wayfair and Expedia embody these lessons, demonstrating how hybrid approaches to LLM adoption can transform operations. By combining external platforms with specialized in-house solutions, these businesses show how to balance flexibility with precision, setting a model for others.

Combining customization with flexibility

The decision to build or buy gen AI tools is often framed as binary, but Wayfair and Expedia show the benefits of a more nuanced strategy. Fiona Tan, Wayfair's CTO, emphasizes the value of balancing flexibility with specificity. Wayfair uses Google Vertex AI for general applications while developing proprietary tools for specific requirements. Tan described the company's iterative approach, noting that smaller, cost-effective models often outperform larger, more expensive options in tagging product attributes such as clothing styles and furniture colors.

Similarly, Expedia uses a multi-vendor LLM proxy layer that allows for seamless integration of different models. Rajesh Naidu, Expedia's senior vice president, describes the strategy as a way to stay flexible while optimizing costs. "We are always looking at best-of-breed (models) where it makes sense, but we are also willing to build for our own needs," Naidu explains. This flexibility ensures that the team can adapt to evolving business needs without being locked into a single vendor.
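The article doesn't detail how Expedia's proxy layer is built, but the idea can be sketched in a few lines: callers address one interface, and the proxy dispatches each task to whichever vendor backend is registered for it. All class and backend names below are illustrative assumptions, not Expedia's actual system.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class LLMBackend:
    """One vendor (or in-house) model; `complete` would wrap the real SDK call."""
    name: str
    complete: Callable[[str], str]

class LLMProxy:
    """Single entry point that routes tasks to registered backends."""
    def __init__(self) -> None:
        self._routes: Dict[str, LLMBackend] = {}

    def register(self, task: str, backend: LLMBackend) -> None:
        # Map a business task (e.g. "summarize") to a backend; re-registering
        # a task swaps vendors without touching any calling code.
        self._routes[task] = backend

    def complete(self, task: str, prompt: str) -> str:
        backend = self._routes.get(task)
        if backend is None:
            raise KeyError(f"no backend registered for task {task!r}")
        return backend.complete(prompt)

# Stub backends stand in for real vendor SDK clients.
proxy = LLMProxy()
proxy.register("summarize", LLMBackend("vendor-a", lambda p: f"[vendor-a] {p}"))
proxy.register("tagging", LLMBackend("in-house", lambda p: f"[in-house] {p}"))

print(proxy.complete("tagging", "blue mid-century sofa"))
```

The design choice this illustrates is vendor independence: because only the registry maps tasks to models, swapping a vendor is a one-line change rather than a refactor.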

Such hybrid approaches are reminiscent of the evolution of enterprise resource planning (ERP) in the 1990s, when enterprises had to decide between adopting rigid, out-of-the-box solutions and heavily customizing systems to fit their workflows. Then, as now, the companies that succeeded recognized the value of combining external tools with specific developments to address specific operational challenges.

Operational efficiency for key business functions

Both Wayfair and Expedia demonstrate that the real power of LLMs lies in targeted applications that deliver measurable impact. Wayfair uses gen AI to enrich its product catalog, automating metadata enhancement. This not only streamlines workflows but also improves search and customer recommendations. Tan highlights another transformative application: leveraging LLMs to analyze outdated database structures. With the original system designers no longer available, gen AI enables Wayfair to mitigate technical debt and find new efficiencies in legacy systems.

Expedia has succeeded in integrating gen AI across customer service and developer workflows. Naidu shares that a custom gen AI tool designed for call summarization ensures that "90% of travelers get to an agent within 30 seconds," contributing to a significant improvement in customer satisfaction. In addition, GitHub Copilot has been deployed across the enterprise, speeding up code generation and debugging. These operational wins underscore the importance of aligning gen AI capabilities with clear, high-value business use cases.

The role of hardware in gen AI

The hardware implications of scaling LLMs are often overlooked, but they play a critical role in long-term sustainability. Both Wayfair and Expedia currently rely on cloud infrastructure to manage their gen AI workloads. Tan notes that Wayfair continues to evaluate the scalability of cloud providers like Google, while keeping an eye on the potential need for on-premises infrastructure to handle real-time applications more efficiently.

Expedia's approach also emphasizes flexibility. Hosted primarily on AWS, the company uses a proxy layer to dynamically route tasks to the most appropriate computing environment. This system balances performance with cost-effectiveness, ensuring that inference costs do not spiral out of control. Naidu highlights the importance of this flexibility as enterprise gen AI applications become more complex and demand more processing power.
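Routing each task to "the most appropriate computing environment" usually comes down to a cost/latency trade-off. The sketch below is a hypothetical policy (not Expedia's actual logic, and the targets and prices are invented): pick the cheapest target whose latency fits the request's budget, falling back to the fastest one when nothing qualifies.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ComputeTarget:
    name: str
    cost_per_1k_tokens: float  # assumed dollars per 1K tokens
    p95_latency_ms: float      # assumed 95th-percentile latency

def pick_target(targets: List[ComputeTarget],
                latency_budget_ms: float) -> ComputeTarget:
    """Cheapest target that meets the latency budget; else the fastest."""
    eligible = [t for t in targets if t.p95_latency_ms <= latency_budget_ms]
    if eligible:
        return min(eligible, key=lambda t: t.cost_per_1k_tokens)
    return min(targets, key=lambda t: t.p95_latency_ms)

# Invented example fleet: a large hosted model, a small fast one, and a
# cheap batch queue for offline work.
targets = [
    ComputeTarget("large-hosted-model", 0.060, 1200.0),
    ComputeTarget("small-hosted-model", 0.004, 300.0),
    ComputeTarget("batch-queue", 0.001, 5000.0),
]

# An interactive request with a tight budget lands on the fast, cheap model;
# an offline job with a loose budget drops to the batch queue.
print(pick_target(targets, latency_budget_ms=500.0).name)
print(pick_target(targets, latency_budget_ms=10_000.0).name)
```

A policy like this is what keeps inference spend bounded: expensive models serve only the requests that genuinely need them.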

This focus on infrastructure reflects broader trends in enterprise computing, reminiscent of the shift from monolithic data centers to microservices architectures. As companies like Wayfair and Expedia scale their LLM capabilities, they demonstrate the importance of balancing cloud scalability with emerging options such as edge computing and custom chips.

Training, management and change management

Adopting LLMs is not just a technological challenge – it's a cultural one. Both Wayfair and Expedia emphasize the importance of fostering organizational readiness to adopt and integrate gen AI tools. At Wayfair, comprehensive training ensures that employees across departments can adapt to new workflows, especially in areas such as customer service, where AI-generated responses require a human eye to ensure they match the company's voice and tone.

Expedia has taken governance a step further by establishing a responsible AI council to oversee all key gen AI decisions. The council ensures that deployments align with ethical guidelines and business objectives, fostering trust across the organization. Naidu also emphasizes the importance of rethinking how gen AI effectiveness is measured. Traditional KPIs often fall short, leading Expedia to adopt precision and recall metrics that better align with business goals.

These cultural changes are critical to the long-term success of gen AI in enterprise settings. Technology alone cannot drive transformation; transformation requires a workforce equipped to leverage gen AI capabilities and a governance structure that ensures accountable implementation.

Lessons for scaling success

The experiences of Wayfair and Expedia offer valuable lessons for any organization looking to effectively scale LLMs. Both companies point out that success depends on identifying clear business use cases, maintaining flexibility in technology choices, and fostering a culture of change. Their hybrid approaches provide a model for balancing innovation with efficiency, ensuring gen AI investments deliver tangible results.

What makes scaling AI in 2025 an unprecedented challenge is the pace of technological and cultural change. The hybrid strategies, flexible infrastructures and strong data cultures that define successful AI deployments today will lay the foundation for the next wave of innovation. Enterprises that build these foundations now will not just scale AI; they will build resilience, flexibility and competitive advantage.

Looking forward, the challenges of inference costs, real-time capabilities and evolving infrastructure needs will continue to shape the enterprise gen AI landscape. As Naidu puts it, "Gen AI and LLMs are going to be a long-term investment for us and it has differentiated us in the travel space. We need to be aware that this will require some conscious investment prioritization and understanding of use cases."


