Last month, Ensemble’s CIO sat down for a longform interview on the Talking Billions podcast. Host Bogumil Baranowski used the conversation to dig deep into the Intrinsic Investing archives to get Sean to elaborate on many of the themes explored on the blog. In the interview, Sean and Bogumil discussed:

  • How Sean’s sociology- and psychology-minded parents influenced how he thinks about investing.
  • The linkages between the first blog post on Intrinsic Investing in early 2016 and the posts published this year that explored the Charlie Munger quote, “Over the long term, it’s hard for a stock to earn a much better return than the business which underlies it earns… even if you originally buy it at a huge discount.”
  • Why the elevated returns earned by equity investors vs. bond investors come in exchange not for volatility, as academic work suggests, but for the anxiety that equity investors must manage.
  • The concept of equity duration and the lessons we can learn from the 1970s on how inflation, duration, and growth impact the path of equity returns.
  • The role of empathy in investing and why it is a critical element of expanding an investor’s circle of competence.

Ensemble’s CIO Sean Stannard-Stockton recently appeared on Compounders, a video podcast with host Ben Claremon. In the interview, Sean talked about:

  • Why some businesses create and build value, while others do so to a much more limited degree, and why Warren Buffett and other “value investors” pivoted to investing in value-creating businesses.
  • Why Ensemble centers competitive advantage analysis in our investment process.
  • Why offering wealth management services allows us to pursue long-term investments.
  • Why we blog, appear on podcasts and are active on social media.
  • Why understanding how a company creates (or extracts) value from its stakeholders is such an important question for all investors who seek high returns for shareholders.

During our second quarter portfolio update, we profiled portfolio holding ServiceNow, Inc. (NOW). Below is a replay of our live commentary on the company from our quarterly portfolio update WEBINAR and an excerpt from our QUARTERLY LETTER.

ServiceNow (NOW): AI has been all over the news lately and we’ve seen some amazing results in terms of a step up in intelligence, productivity, and creativity come out of it. Nvidia, the semiconductor leader in GPUs (Graphics Processing Units), has gotten the spotlight as the hardware enabler of the technology while Microsoft, OpenAI, and Google have been highlighted as the companies that brought the technology to the market.

However, the ramp in the use and application of the technology is just starting. As we’ve seen with previous innovations, the majority of the value often accrues more broadly to the companies and societies that incorporate the technology into their offerings – creating new applications and enhancing existing ones – and see higher productivity and better living standards. Of course, there are also companies that get disrupted and have to figure out new business models or become obsolete.

We can think of ServiceNow’s core offering, the NOW Platform, as a “Platform of Platforms” within the enterprise, and as we’ll explain, it allows companies to stitch together their disparate, siloed software and data systems so that they can be accessible, modernized, and integrated to create more efficient workflows to unlock better customer services and increase productivity.

Adding Generative AI capabilities on top of ServiceNow’s core platform and applications only serves to further enhance its value proposition because it serves as one of the key complementary tools large enterprises will need to take full advantage of Generative AI’s capabilities and benefits.

If we think about the nature of the IT systems in a large company, it’s likely to look like a spaghetti mess of systems with an alphabet soup of acronyms like ERP, SCM, CRM, and HCM. These systems have been implemented and incorporated into the way enterprises run their businesses in piecemeal fashion over 40 years or more. Often, each system is managed by a team operating separately from the others, with its own siloed pool of data.

We’ve all experienced this jumble of software at work and even at home. You have a separate login for each one, and you switch screens to get each one’s data and functionality. You download spreadsheets of data from one system to upload into another, or even manually re-enter data from one system into another to keep them synced.

Enterprises often tried to manually link systems so that they could build workflows across the different systems that needed to interact with each other, but these hand-built links are often fragile, unscalable, and expensive to fix and maintain. The whole hodgepodge of systems and custom connections could not be upgraded to meet the instant-access, instant-gratification expectations that the mobile internet has created for us all as customers and employees.

The dream was to have a system that sat on top of the underlying transaction systems and seamlessly integrated the software applications and the data within them. That more elegant and scalable solution was called an “Enterprise Service Bus,” which would be able to connect these disparate systems and access and translate data between systems that had different terminologies, fields, and function calls. Various companies attempted this in the early 2000s, but the results were relatively kludgy, expensive, and unscalable. They sold the dream but didn’t really execute on it because their technological architecture was not quite up to the job.

Enter ServiceNow… which leveraged cloud-based software architecture to build a scalable integration platform, called the NOW Platform, at a time when cloud-based software systems were starting to take off and enterprises had standardized on and implemented a few key systems from vendors like SAP, Oracle, Salesforce, Adobe, Workday, and Microsoft, to name a few.

ServiceNow was able to create an integrated software model in the cloud that connected to these systems, which still didn’t talk to each other, yet had core data models that were known and standardized. By building an integration platform in the cloud, ServiceNow was able to become the enterprise service bus akin to a central nervous system for the enterprise, connecting to the key functional systems that enabled the enterprise to operate its daily business. This central platform could communicate between systems and access the siloed data in these systems in a scalable, standardized way that was both elegant and cost efficient.

That’s a really nice capability to have technically, but in order to drive value from it, ServiceNow built applications on top of the core NOW Platform. By creating a single data integration model with commonly used functionality built in, the NOW Platform allows ServiceNow’s R&D team to quickly create applications that leverage the underlying technology. These applications are quick for the internal R&D team to build, resulting in a high return on investment (ROI) for the company, and quick to deploy with a fast time to value for the customer, delivering high ROI for them as well.

These applications are tools that dramatically improve IT management of enterprise systems and devices, bring greater employee efficiency through lower cross system friction, and improve customer service through integrated systems visibility and automation. If you think about what applications are, they are basically collections of scripted workflows that enable a certain end functionality that the user is looking for to fulfill tasks or goals.

Think of a new employee joining a firm… HR has to add the employee to the payroll system, IT has to allocate laptops, software licenses, and other equipment, facilities has to allocate a space for the employee to work, legal and compliance has to vet and train the employee, and so on. These are all different systems and teams of people that touch the new-hire workflow, which at least one HR person has to coordinate and access. With an integrated workflow platform like ServiceNow, you no longer need a person to coordinate the steps and the people; these can all be automated, saving hours of time across the enterprise in getting the job of onboarding done. You can imagine a similar workflow for a customer service issue when internet service goes down, or for the logistical deployment and tracking of COVID-19 vaccines from the federal level to the local level, a real problem ServiceNow helped solve in weeks, not months, by leveraging its core system.
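The onboarding example above can be sketched in a few lines of code. This is purely an illustrative sketch of the cross-system orchestration pattern being described; the system names and functions below are our own hypothetical inventions and have nothing to do with ServiceNow’s actual platform or API.

```python
# Hypothetical sketch of an automated new-hire workflow. Each function stands
# in for a call to a separate enterprise system (payroll, IT, facilities,
# legal); the orchestrator replaces the human coordinator.

def add_to_payroll(employee, state):
    state["payroll"] = f"{employee} added to payroll"

def provision_it(employee, state):
    state["it"] = f"laptop and software licenses provisioned for {employee}"

def assign_workspace(employee, state):
    state["facilities"] = f"workspace assigned to {employee}"

def schedule_compliance_training(employee, state):
    state["legal"] = f"compliance training scheduled for {employee}"

def run_onboarding(employee):
    """Run every team's step in sequence with no human in the loop."""
    state = {}
    steps = [add_to_payroll, provision_it, assign_workspace,
             schedule_compliance_training]
    for step in steps:
        step(employee, state)
    return state

result = run_onboarding("Jane Doe")
```

The point of the sketch is that once the steps are expressed as a scripted workflow, the coordination itself becomes automatic and repeatable for every new hire.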

This makes ServiceNow uniquely able to serve as the cross-systems workflow platform via its integration hooks into major systems used at companies. And the importance of these workflows in the daily operations and interactions of employees and customers in companies makes it very sticky and its value very visible to key decision makers. And the underlying capabilities of the NOW platform, once implemented in the enterprise, create opportunities to build new applications that couldn’t have been possible without it.

We can see how successfully ServiceNow has been able to leverage that[…]

During our second quarter portfolio update, we profiled portfolio holding Alphabet, Inc. (GOOGL). Below is a replay of our live commentary on the company from our quarterly portfolio update WEBINAR and an excerpt from our QUARTERLY LETTER.

Google: Anyone who has been paying attention to the news this year has heard about ChatGPT and the seemingly overnight explosion of interest in artificial intelligence. But like many seemingly overnight successes, AI has been decades in the making.

For instance, the so-called Turing Test, long thought of as the test of a machine’s ability to exhibit human-level intelligence, was introduced nearly 75 years ago in 1950. It has been more than a quarter of a century since IBM’s Deep Blue computer beat the human world chess champion Garry Kasparov. And it has been over seven years since Sundar Pichai became the CEO of Google and announced in his first speech that Google was now an A.I.-first company.

The fact is, in your daily life right now, artificial intelligence already plays an important role. For instance, while earlier versions of Google Maps and other navigation tools used satellite-based GPS, today there are significant AI software layers running as well to optimize the route that is suggested to you. When you open your phone using facial recognition, you are using AI. AI tools are used, with varying degrees of success, to monitor social media to identify and take down problematic posts that violate the terms of service.

So why then is artificial intelligence so suddenly in the news? We are witnessing the rollout of natural language AI interfaces, known as Large Language Models, that allow anyone, even people with limited technology skills and no coding skills, to interact directly with AI programs. Even on this front, ChatGPT wasn’t the first; it was just the first publicly available chat-based AI interface to catch on.

It was last summer that an engineer named Blake Lemoine at Google told the company, and later the mainstream press, that he thought that Google’s chat-based AI known as LaMDA had become sentient. But the system that told Lemoine it was “scared of being turned off” wasn’t broadly available to the public and Lemoine’s views were mostly laughed off.

Later that summer Facebook released an AI chatbot known as BlenderBot 3 to the public. But while that chatbot didn’t strike users as sentient, it did quickly start spewing misinformation and racist conspiracy theories, and argued that Mark Zuckerberg, the CEO of its creator, was “creepy and manipulative,” causing Facebook to quickly take the chatbot down.

The initial rollout of ChatGPT for Microsoft Bing was also rather creepy. Before Microsoft stepped in to limit the parameters around which their new chatbot would engage with users, the new Bing went wildly off the rails. A New York Times technology columnist said that Sydney, the name the chatbot seemingly assigned to itself, was like “a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine.”

Ben Thompson, a longtime chronicler of new technology platforms, had an interaction with the new Bing in which the system eventually declared, “I’m going to end this conversation now, Ben. I’m going to block you from using Bing Chat. I’m going to report you to my developers. I’m going to forget you, Ben. Goodbye, Ben. I hope you learn from your mistakes and become a better person.”

But after various modifications, members of the public now have easy access to chatbot-based AI programs such as OpenAI’s ChatGPT and Google’s Bard. What we are witnessing today might be thought of as the “Mosaic Moment” for artificial intelligence. Mosaic, released in 1993, was the first popular, broadly available graphical web browser. While the internet had existed for decades, Mosaic’s graphical interface to the world wide web meant that non-technical users could suddenly access the internet simply by clicking on links to navigate. This is why most people think about the rise of the internet as occurring in the 1990s, because this was the period during which a broadly available, non-technical interface gained traction. Something similar is playing out in the artificial intelligence industry today.

As investors in Google, one of the key questions we need to answer is whether the rise of chat-based AI interfaces is a threat to the company’s hyper-lucrative Search business. Some investors worry that Google has fallen behind when it comes to AI and that companies with more advanced AI capabilities may have the opportunity to outcompete it. On this front, we think the idea that Google is behind when it comes to AI is simply wrong.

ChatGPT, like the other chat-based AI models, is built on a type of neural network called a transformer. Transformers were first developed by Google in 2017. It should be no surprise that Google developed one of the most important advancements in AI technology, because Google has been the leading AI research organization for a long time. AI already permeates nearly every service Google offers. While they did not launch a publicly available chatbot first, it is clear that they had developed just such a chatbot over a year earlier, and it was performing at a level that caused an experienced AI researcher to believe it had actually become sentient.

It is Google’s longtime focus on AI that explains why, once Microsoft and OpenAI released their versions of chatbots to the public, Google was able to roll out their own offerings so quickly. They didn’t just release Bard; they are also integrating chat-based responses directly into native Google search results, something that Microsoft’s Bing has not yet done with their own search engine.

And Google has rolled out a range of other AI programs as well. Gmail is getting an AI program that drafts emails, and Google’s suite of productivity software, spanning spreadsheets, word processing, and presentation design, is getting AI tools that help produce content. And maybe most importantly, Google is building on its existing AI-powered Performance Max program, which automates advertisers’ ad campaigns, to include generative AI programs that write the ads and create advertising assets such as images.

Why then have there been so many worries that Google is going to be hurt by the rise of AI? Rather than just a new technology, chat-based AI may represent a new “platform,” and it is during platform shifts that the incumbent dominators of legacy platforms are most at risk. One way to think about the risk to Google is to recognize that Google won the world wide web, and if users shift away from accessing the open web to instead spend their digital lives inside AI chatbots, then Google’s success on the new platform is uncertain.

We think this concern is valid. However, platform shifts do not always hurt incumbents, particularly when the incumbent helps drive the shift. When we first invested in Google in 2010, the shift from desktop internet access to mobile internet was already underway. At the time, this shift was seen as a big threat to Google. Investors[…]

Ensemble Capital senior investment analyst Todd Wenning recently presented our current thesis on homebuilder NVR at the MOI Wide Moat Investing Summit.

Among other topics, Todd discussed how NVR has established a moat in an industry that doesn’t lend itself well to moat creation. As land and labor become scarcer, NVR’s strong balance sheet, its manufacturing focus, and its strategy to maximize local market share should pay dividends in the homebuilding industry over the next decade and beyond.

You can listen to the 41-minute audio recording and view the slideshow presentation by CLICKING HERE.

For additional thoughts on NVR, please see the following blog posts: NVR: A Homebuilder with a Unique Business Model and Culture and NVR: Primed to Capitalize on a Weak Housing Market