On today’s episode we share our First Thoughts on OpenAI's Weekend. The host, Jeff Roster, deep dives into the major shifts and changes in the AI industry, specifically revolving around OpenAI's recent events. He is joined by guests Brian Sathianathan and Gautham Vadakkepatt, who are industry experts. They discuss the rise of Artificial General Intelligence ( #AGI ), OpenAI's achievements, its changes in leadership, and its corporate structure. They also delve into how Microsoft's involvement might shape the future of AGI and discuss how businesses can strategically leverage AI technology moving forward.
Give it a listen and let us know what you think!
#thisweekininnovation, #TRI, #5ForcesOfInnovation, #podcast, #retailpodcast, #emergingtechnologies, #Retailers, #retail, #retailindustry, #retailtechnology, #retailtech, #futureofretail, #innovation, #innovationstrategy, #retailinnovation, #retailtrends, #retailinsights, #retailnews, #DigitalTransformation, #VentureCapital, #VC, #Founders, #Entrepreneurs, #startupstrategies, #startupfunding, #startupstories, #startupsuccess, #startupfounders, #retailstartups, #founderstories, #founderlife, #Gartner, #IHL, #ArtificialIntelligence, #AI, #cloud, #data, #deeplearning, #naturallanguageprocessing, #sentimentanalysis, #conversationalai, #InternetOfThings, #IoT, #machinelearning, #Blockchain, #virtualreality, #augmentedreality, #personalization, #datamining, #SaaS, #Recommendations, #QRcodes, #Robots, #vr, #3d, #ar, #xr, #NFTs, #unifiedcommerce, #socialcommerce, #mobile
[00:00:04] Jeff Roster: Well, hello
[00:00:04] everyone. And welcome to this very special edition of This Week in Innovation. This is what I've been waiting for. We have an emergency podcast. Let's talk about all the activity that's happening, uh, over the last weekend. And this is probably one of those podcasts that will not be evergreen.
[00:00:18] It's, uh, literally changing as I'm writing the script for today's conversation. And so to analyze what's happening with, uh, OpenAI, I have two old friends, Brian and Gautham. Uh, Brian, why don't you introduce yourself to this audience? Since we're not talking about retail now, we're really talking about AI holistically.
[00:00:36] We might have a whole lot of new listeners. So Brian, why don't you go ahead and introduce yourself.
[00:00:40] Brian Sathianathan: Hi, I'm Brian Sathianathan. I'm the co-host with Jeff Roster, but today I'm playing a different role, as an expert on this topic. But, um, I'm also, uh, a CTO and co-founder of a low-code AI company called Iterate.
[00:01:03] Gautham Vadakkepatt: Hi, uh, Gautham Vadakkepatt. I'm an associate professor of marketing at the University of Central Florida. I'm an emerging tech and sustainability enthusiast that's been tracking the evolution of AI in general for the past few years.
[00:01:19] Jeff Roster: Fantastic. So as everyone knows, uh, hopefully everyone knows, this past weekend, other than watching my beloved San Francisco 49ers absolutely dominate in the game, at the same time I was watching that, I was reading my Twitter feed and all this craziness that was happening.
[00:01:35] Brian Sathianathan: I was in the Formula One too, Jeff. Don't forget that.
[00:01:37] Jeff Roster: Oh, that's right, in Las Vegas. Must have been winning it, yeah.
[00:01:44] Jeff Roster: So anyway, very busy weekend for all three of us. But, um, obviously a lot of stuff happening with OpenAI, and so we thought we'd get together and have a quick conversation about what it means.
Brief History of OpenAI
[00:01:54] Jeff Roster: And I thought before we get started with the conversation, I'd go back and maybe frame things for [00:02:00] some of the listeners that aren't necessarily living, eating, sleeping AI the way all three of us are. And I thought the best way to do that would be to recap a little bit of the history of this particular company.
[00:02:10] Um, just trimming down, uh, a tweet from, um, Chamath Palihapitiya, who I thought really summed it up well. Uh, his tweet is well worth the read, by the way, and subscribing to his Twitter feed is also worth it, even at nine bucks a month. Um, so let me go ahead and just kind of set the stage for the conversation.
[00:02:27] And this is all his words. I've edited it down dramatically, but there's not an individual word from me; it's all from Chamath. OpenAI was initially founded in 2015 by Sam Altman, Elon Musk, Ilya Sutskever, and, uh, Greg Brockman as a nonprofit organization with the stated goal
[00:02:46] of advancing digital intelligence in a way that was most likely to benefit humanity as a whole.
OpenAI's Transition and Challenges
[00:02:52] Jeff Roster: In 2019, OpenAI transitioned from a nonprofit to a capped-profit model. According to the company's blog post, [00:03:00] OpenAI wanted to increase its ability to raise capital while still serving its mission, and no pre-existing legal structure they knew of struck the right balance.
[00:03:09] That's a quote, uh, quote unquote. OpenAI came up with a novel structure that allowed the nonprofit to control the direction of a for-profit entity while providing its investors a capped upside of a hundred X. This culminated in a $1 billion investment from Microsoft, marking the beginning of a key strategic relationship.
[00:03:32] But complicating the company's organizational structure and incentives, the nonprofit entity OpenAI Inc. became the sole controlling shareholder of a new for-profit entity, OpenAI Global LLC, which answers to the board of the nonprofit and retained a fiduciary responsibility to the company's nonprofit charter.
[00:03:56] Crucially, the board was responsible for [00:04:00] determining when OpenAI attained artificial general intelligence, or what we'll all start talking about as AGI, which the company defines as highly autonomous systems that outperform humans at most economically valuable work. Skipping down further: in 2020, bolstered by its new funding, OpenAI unveiled GPT-3, a large language model capable of understanding and generating convincing, human-like text.
[00:04:30] December 2022 marked another major milestone for OpenAI with the release of GPT-3.5, laying the groundwork for the consumer-focused application ChatGPT.
OpenAI's Recent Developments and Controversies
[00:04:42] Jeff Roster: On Friday, OpenAI announced it was removing its co-founder, Sam Altman, as CEO, citing a lack of consistent candor in his communications with the company's board.
[00:04:52] According to the company's official statement, the board no longer has confidence in Altman's ability to continue [00:05:00] leading OpenAI. Elon Musk had resigned his board seat in 2018, citing a potential future conflict of interest with Tesla's AI development for driverless cars. Um, the remaining board members that removed Altman are Adam D'Angelo, CEO of Quora; Tasha McCauley,
[00:05:21] co-founder of Fellow Robots and adjunct senior management scientist at the RAND Corporation; Ilya Sutskever, co-founder and chief scientist of OpenAI; and Helen Toner, director of strategy and foundational research grants at Georgetown University's Center for Security and Emerging Technology. Altman's removal as CEO prompted the resignations of president and co-founder Greg
[00:05:44] Brockman and three of the company's senior scientists. And then, uh, Chamath's conclusion: while the details of Altman's removal are still unfolding, and that's as of this writing, which was Sunday afternoon, it is becoming [00:06:00] increasingly clear that OpenAI's convoluted corporate structure led to conflicting motivations and incentives within the company.
Discussion on OpenAI's Future
[00:06:10] Jeff Roster: So gentlemen, going over to you. Oh, actually, before I do that, let's just, um, so that was Sunday morning. Since then, uh, the whole idea that Sam might possibly regain control was kicking around, I think, probably, what, Saturday night, Brian? And then into Sunday morning; that's what the Wall Street Journal was talking about.
[00:06:31] Then that was rebuffed. Um, our new interim CEO is Emmett Shear; we'll talk a little about that.
Microsoft's Role in OpenAI's Future
[00:06:39] Jeff Roster: And then the big thing, to me the big thing, which is what we're really going to spend most of our time talking about, was the role that Microsoft came in and played. So Microsoft came in and acknowledged, uh, that, um, I don't know, how would you describe that, Brian?
[00:07:01] Brian Sathianathan: I think Microsoft, uh, of course, I think they are still very committed to the journey, and I think they wanted to have Sam and Greg join them. And of course they ended up joining Microsoft for this new division.
[00:07:15] Jeff Roster: Okay. Yeah, perfectly said. I was struggling to think about what I was going to say there. So that being said, that's where we're at. Um, first of all, Brian, obviously I just went through mountains of iterations of things, a little bit of the history. Is there anything I missed that would help set up our conversation?
[00:07:36] Brian Sathianathan: Uh, while all this is happening, I just saw on LinkedIn, and on a few other posts and some blogs as well, that 700 out of some 750 OpenAI employees have sent a letter to the board asking for changes, right? Which means there could be, uh, you know, people leaving the company and all kinds of changes that are about to come pretty soon.
[00:08:05] Jeff Roster: And I just, you know, Brian, to be honest, I saw that come through, and I was in the process of screen-grabbing it when something else happened. And then it's like, yeah, there's also that. I mean, that's huge.
[00:08:14] Brian Sathianathan: It's unraveling really fast, but I think, Jeff, in our coming questions we'll talk more about it.
[00:08:19] I think this whole structure, you know, a nonprofit board owning for-profit subsidiaries, with the subsidiaries being capped, is an interesting structure for research. But it also poses some structural problems. We'll continue to talk about that as we go through.
[00:08:35] Jeff Roster: Okay, so yeah, good addition. I meant to add that in, but that's my bad. Um, so obviously the first question, Brian, and Gautham too, if you want to jump in on this.
Implications for AI Strategy
[00:08:44] Jeff Roster: How does this shape the future of OpenAI and AGI, artificial general intelligence?
[00:08:52] Brian Sathianathan: Um, I can take that and then have Gautham jump in as well. But I think at a higher level, um, you know, [00:09:00] AGI, artificial general intelligence, has been growing really rapidly.
[00:09:03] Um, by some numbers that I saw, it has grown almost 3000x since 2018, right? That's how fast it's growing. Uh, which is why companies like OpenAI were actually structured as nonprofits fully owning capped-profit subsidiaries underneath the holding structure that they've had.
[00:09:26] Right. I think at a higher level, what's going to happen, in my mind, I feel like it's almost like a bell curve, right? There is AI that consumers need, that consumers would love to have, and then there is AI that enterprises would like to have, right? You're almost thinking about a bell curve, right?
[00:09:44] At the very top of the bell curve is the optimum, where it's just enough for the enterprise and it can solve most of the use cases. If you go past it, on the right-hand side of the bell curve, you'll actually come down, and you'll end up creating job losses. So there is always this conflict [00:10:00] between a full AGI and an AI that's just necessary to be commercialized and rolled out.
[00:10:07] See, I think, you know, in the long run it's hard to say, but my speculation is that OpenAI will continue to remain a research company. There might be quite a lot of restructuring there, but maybe much, much smaller than what it is, or eventually taken over by other entities. Yeah, it's hard to really predict, but I think this also shows the structural problems in these types of structures as well.
[00:10:30] You know, where nonprofits own for-profit companies. Because these companies have a problem: they all need a lot of servers and a lot of GPUs. So they need money to keep the nonprofit going, because, you know, the profit that comes out of something like ChatGPT or any of these applications is not enough to sustain the infrastructure costs, right?
[00:10:53] So that's why you need these things. But I think there are two things that are going to happen. One is that future founders of these [00:11:00] AI labs and their potential investors or donors are going to think hard about what the new structure is going to be. Number one. Number two, I think, you know, corporate investors like Microsoft or Amazon are going to think about
[00:11:14] what these holding companies look like and how much control they have in them in case issues like this happen. What does that look like, right? Because if you look at the AGI license, I don't think Microsoft had a license to OpenAI's AGI, based on what I read in public; I mean, private contracts might be different. They had a license to the ChatGPT software and so on.
[00:11:31] So the control structures are going to change significantly, right? Uh, so those are some of my points of view. Uh, Gautham?
[00:11:43] Gautham Vadakkepatt: Yeah, I can't agree more. I think, one, from a corporate perspective, this is corporate venturing, right? So how they actually think about these organizational structures is going to be rethought.
[00:11:55] Companies are probably going to take a more diversified [00:12:00] approach to doing this and perhaps seek more control in some of these structures. But it's a hard challenge. For the startups, they need to raise money and build infrastructure to get to this state, but on the downside, that means there are some trade-offs associated.
[00:12:18] So from a corporate perspective, I think they'll have to rethink. From a startup perspective, they'll have to rethink the structure. Where I see things going in the future is that there's going to be a little bit of a pause as companies rethink the way they actually integrate these models into their own products and technologies, to really understand the ramifications of this.
[00:12:45] Um, they're going to let this play out a little bit more.
[00:12:49] Jeff Roster: Brian, I think you said, if I heard you correctly, that OpenAI would be smaller. Certainly their market cap is going to be radically smaller, but do you mean fewer total [00:13:00] associates working in the research? Do you know what their headcount was?
[00:13:03] I know 750, um, have signed a document, but I actually don't know what their headcount was. Do you?
[00:13:10] Brian Sathianathan: I'm not sure, probably under a thousand, but we can easily find that.
[00:13:13] Jeff Roster: So do you think there'll be a smaller actual headcount when you say the company is going to be smaller? Or whatever we're calling it, a company or a lab.
[00:13:26] Brian Sathianathan: No, I think it could be a smaller company from a headcount perspective, as well as much more focused on certain AGI tasks, as opposed to enterprise tasks. And things could be broken up. These are all speculations, of course; you know, we are speculating, we don't know. It could be that, you know, the enterprise side could be handled purely through Microsoft and so on.
[00:13:45] Uh, the AI side, the AGI side of the research, would continue as a smaller company, right? Mm-hmm. And there are also higher chances everything would just blow up and everything could come [00:14:00] down, and then eventually the enterprise side would just be owned by Microsoft.
[00:14:05] So you never know. I mean, it's very hard to tell these things.
[00:14:07] Jeff Roster: Well, there were a lot of moving pieces from Friday night, when I was minding my own business, to today, Monday morning. The amount of movement is something I've never seen. Um, the only thing, Brian and Gautham, that I can even think of is when Mark Hurd got pushed out of HP on a Friday afternoon.
[00:14:25] And, um, that was when I was still back at Gartner, and there was a lot of speculation as to what was going to happen with him. And he was working for Larry Ellison at Oracle within a day and a half. So that's the only example, also on a weekend, that I can ever think of with this amount of movement, and that situation is a rounding error compared to what this is. Brian, where do you think Microsoft will go with AI, you know, based on these new appointments of Sam Altman and Greg?
[00:14:55] Brian Sathianathan: Uh, I think it's a great question, Jeff. I think they are very committed to this [00:15:00] journey; that's one of the reasons, I think, you know, they did what they did, right? Um, I think at a higher level, there are a couple of things to look at. First, of course, regardless of how it all turned out, it's still somewhat of a damage-control situation, a bit of a perception-of-instability equation, that Microsoft has to handle, right?
[00:15:19] Because, you know, there is a lot of investment in this large language model area. Uh, second, I think, is, you know, if you look at how the other clouds are playing, like Google or even Amazon, even though they have some investments in some of the large language models,
[00:15:37] they are also playing a significant sort of Switzerland approach, where they're trying to support a lot more large language models. So, um, I think Microsoft also supports Llama 2 now, and some of the others as well. So there will be somewhat of a Switzerland approach that Microsoft will begin to take.
[00:15:55] Um, and then I think in the background they will begin to develop what the next [00:16:00] phase of this thing looks like, right, in terms of AI. So I think that's what's going to happen. Uh, those two actions would probably happen, along with some near-term damage control and, uh, you know, managing some of that perception of instability, because Microsoft's business is with enterprises, right?
[00:16:18] Yes, they do have a big consumer business too, but I think this will most likely be a big enterprise play. So I think the messaging to the enterprises and CIOs, in terms of the stability of all these things, is very, very critical.
[00:16:33] Jeff Roster: Let me ask you this as a follow-on. Um, two months from now, do you think anybody other than people like you, who are deep, deep into the weeds on all things AI, will even notice what happened here?
[00:16:46] Will it matter to us? You know, not just the average consumer, but the average tech person who's still moving fairly aggressively along an AI strategy?
[00:17:07] Uh, and it's going to continue for years to come. Generative AI is top of mind for every board member, right? And it eventually trickles down to every CIO and every other leader in the company in terms of, you know, what is your priority, what are you doing? So they're going to think hard about a strategy, about how they build something sustainable, because
[00:17:29] LLMs and generative AI are not going away. They're going to be there. Now, the question is, in a world of shifting sand, moving so fast, how do you build a layer on top of it that gives you sustainability, right? I think a lot of the leaders will begin to think about that.
[00:17:47] Which means they will think about avoiding vendor lock-in, how to have the ability to, you know, switch between these things in case something happens, right? All those things will be thought through, [00:18:00] um, in terms of, you know, what the AI strategy is.
[00:18:06] Jeff Roster: So Gautham, as I was going through all the research this morning, I came across something that was published by a, uh, professor that I'm very closely associated with.
[00:18:17] Oh, it happened to be you. Um, I was surprised at how fast some academics actually got content on this published. Um, do you want to unpack some of your thoughts here in this piece? For the listeners, let me just read it. Um, actually, unless you want to read it, or recap what you said and then respond to it.
[00:18:36] Gautham Vadakkepatt: Um, so my perspective is that Microsoft has emerged the winner, right? There were mistakes made by OpenAI in the past few days that have led to this point, and Microsoft has emerged as the winner by being able to get Sam Altman and Brockman onto their team, uh, and [00:19:00] basically light a fire under what has already been stated publicly to be one of their core directions moving forward, right?
[00:19:08] So to that extent, this ties in very nicely to your last question to Brian, about where he thinks Microsoft is going to head with this. This ties in very nicely with the future of Microsoft. In the short term, sure, they're going to try to mitigate; they're going to put some bandage around this issue, uh, try to stem the flow.
[00:19:29] In some ways, do some reputation correction for ChatGPT and OpenAI. But in the longer term, you're going to see them leverage this technology into their applications. They have applications in healthcare, they have applications in retail, they have applications in finance. So you can see them build out these technologies within those verticals at an enterprise level, right?
[00:19:53] My longer-term point of view is that OpenAI did something that was really [00:20:00] amazing, right? There was a reason why they got fastest to the hundred million users, and it is because of their structure. They were a nonprofit who, as you read earlier from the charter, were supposed to provide that safety assurance that allowed people to have confidence in the technology.
[00:20:18] Look, GPT, the generative pre-trained transformer, builds on transformer work from Google, right? There was a reason why it was not pushed out by Google. We needed OpenAI or something like it to make this go mainstream, and that nonprofit structure allowed it to go mainstream. But the downside is that when you tie in the money that is needed to make this succeed, which Brian talked about, they had to come up with a creative structure, and the tensions between the two eventually led to some of these issues.
[00:20:52] And that's going to be the longer-term issue that everyone has to deal with: how are we going to actually come together to [00:21:00] deal with the safety aspects, the ethics aspects? How do we actually develop AI at a fast rate, and how do we build the infrastructure that makes this cost-effective, right? Um, and that's where I think that balance is.
[00:21:17] That's where I think we're going to head in the future, and I think companies are going to take a pause to think about this a little bit more critically. And to your earlier question to Brian, about whether people are going to talk about this in the future: I think they are going to talk about this for a long time.
[00:21:46] Jeff Roster: Um, if you could go back in time to 2014 and advise the start, you know, open AI as it was building out, would you have them do something different or would you sort of let the, I mean, They were [00:22:00] successful. Um, amazingly, unbelievably successful. And maybe that's because as you referenced, it was, it was this particular model.
[00:22:08] Would you do something different or would you just sort of let this whole play out and then just clean up the mess?
[00:22:13] Gautham Vadakkepatt: My perspective was that, like, I actually thought this structure was needed. They needed the nonprofit to actually build trust, right? There were all these concerns about AI, and I was one of those people, and I still am, right?
[00:22:39] For that reason, the nonprofit part is what allowed me to play with ChatGPT's version and not the other versions, right? And obviously that's not really a good thing to do, and Brian will tell me that, but they needed that to get going, to get consumers to adopt this. And they needed that capped-profit structure to raise the money. Think about the infrastructure that's needed to get this going.
How do you [00:23:00] balance the for-profit with the safety part? And this is part of the evolution, I think, of the industry. And I hope this debate continues, because while this is a moment in time, we need to actually think through this very deeply to really understand where we're going to head and how we can restructure.
[00:23:19] I don't have an answer, but I can't blame what they did in the past. I think that was creative and perhaps the right thing to do. Yeah. But I'm sure I'll change my point of view.
[00:23:30] Jeff Roster: Yeah. Well, that goes with your profession. You're kind of like me as an analyst; we can change in a heartbeat on whatever we say, whatever hot take we have at the minute.
[00:23:39] But I tend to kind of agree with you. You know, I don't know that we would've gotten to the point where we're at in adoption without having the format they had. And, you know, I think when it's all said and done, the mess over the last three or four days will be cleaned up fairly quickly.
[00:24:00] Brian Sathianathan: I think the other thing, Jeff, that could also happen is that the structure could still remain, um, and continue to be used, but it could be modified, right? The challenge, I think, in the structure is that the nonprofit's public board completely controls and owns everything that happens in the holding company, which actually owns
[00:24:18] the private company that all these companies invest into, right? So it's like traditional old-school venture capital; it's called the three-prong or four-prong structure. But instead of the partners owning the structure, it's actually the nonprofit company that owns it, right?
[00:24:36] Which I think is an interesting thing. But, you know, legally you can create different types of flows and hooks, both contractual as well as, you know, certain types of equity arrangements, to make it work. So what will happen is a lot of people will think about the model.
[00:24:54] The structure in general might persist, but there might be modifications of it, right? [00:25:00] Uh, I'm not quite familiar with the Anthropic model, which is another company that's kind of very similar, but they also have a long-term benefit trust built into the structure as well.
[00:25:12] So there are all kinds of interesting things that could, you know, come into play in terms of how all these things play out. And I think in the future, what's going to happen, though, like I said before, is that corporate investors, these big companies, are going to think long and hard about it, right?
[00:25:27] In terms of, you know, am I investing into something where I have enough control, right? That question will continue to come up.
[00:25:35] Jeff Roster: So more work for the attorneys and the consultants. That's good.
[00:25:39] Brian Sathianathan: Yeah. I mean, a lot of these things have been done in other industries in the past. It's not like it's impossible to do.
[00:25:44] It's just that, you know, when something happens, everybody reacts; it's just like Sarbanes-Oxley, and now you have all these security controls. A lot of these things will continue to happen.
[00:25:54] Jeff Roster: Yeah. Final question, Brian.
Advice for AI Strategy
[00:25:56] Jeff Roster: What advice would you have for our listeners and our viewers, um, [00:26:00] now that we're active on YouTube, for their AI strategy, given what we've seen in the last three or four days? Or is there any change that you would suggest?
[00:26:09] Brian Sathianathan: I think the first advice doesn't change. If you're a senior leader, think about use cases that are applicable to your organization and your company, and begin to use it, right? Leverage and take advantage of generative AI as much as possible. I don't want anything that's happening now to deter your vision, because this is all long term, you know?
[00:26:30] So your long-term views are not going to change, and the world is not going to change, right? The cat's already out of the bag, right? So the first advice is, you know, think about use cases, and don't look at generative AI as a shiny object, right? Because it's here to stay.
[00:26:51] You know, a lot of times new technologies come in and leaders are like, oh, well, this is cool, but then it's a shiny object; it goes away after a bunch of marketing campaigns. This is not that, right? This is [00:27:00] omnipresent; it's going to be there for a long time. So think about use cases and how they can help your business, because that's the first advice. The second advice is, um, as you're thinking about these things, think about how you can build a platform, or take advantage of a platform, where you can experiment with multiple large language models and multiple solutions from multiple providers, right?
[00:27:24] Because the open source in this world is also getting really strong, right? I mean, Meta's Llama 2 got released, and I'm sure there might be newer versions there as well. Uh, and then there is another really cool one called Mistral AI, which has another really interesting, fully open-source large language model that's faster than Llama and very advanced, state of the art.
[00:27:48] So a lot of these things are coming in. So as a senior leader, think about, you know, working with platforms where you can take advantage of several models and have the ability to [00:28:00] switch, right? Don't get locked in with one thing. That's the second advice I would give.
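[Editor's note: to make the "ability to switch" advice concrete, here is a minimal sketch, in Python, of what a provider-agnostic routing layer could look like. The `LLMRouter` class and the placeholder providers are hypothetical names for illustration only; a real deployment would plug actual hosted or open-source model clients, such as a Llama 2 or Mistral endpoint, in behind the same interface.]

```python
# A minimal sketch of avoiding vendor lock-in: wrap each model provider
# behind one common interface so application code never calls a vendor
# SDK directly. Provider names and responses here are placeholders.
from typing import Callable, Dict, Optional


class LLMRouter:
    """Routes completion requests to whichever provider is active."""

    def __init__(self) -> None:
        self._providers: Dict[str, Callable[[str], str]] = {}
        self._active: Optional[str] = None

    def register(self, name: str, complete: Callable[[str], str]) -> None:
        self._providers[name] = complete
        if self._active is None:
            self._active = name  # first registered provider is the default

    def switch(self, name: str) -> None:
        if name not in self._providers:
            raise KeyError(f"unknown provider: {name}")
        self._active = name

    def complete(self, prompt: str) -> str:
        return self._providers[self._active](prompt)


# Stand-in providers; in practice these would call real model clients.
router = LLMRouter()
router.register("provider_a", lambda p: f"[A] {p}")
router.register("provider_b", lambda p: f"[B] {p}")

print(router.complete("hello"))  # served by provider_a, the default
router.switch("provider_b")      # one-line swap, no application changes
print(router.complete("hello"))  # now served by provider_b
```

The design point is that application code only ever calls `router.complete(...)`, so swapping vendors becomes a configuration change rather than a rewrite.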
[00:28:06] Third, take your private data and train these LLMs on it, because that builds an IP moat for your business. If you look at the attention economy, which we've spoken about a lot,
[00:28:23] if you have 24 hours in a day, the big players, Google, Apple, Amazon, Facebook, all their collective technology probably only owns under six hours of your time in a given day. That's actually a lot; TV used to be just 45 minutes.
[00:28:41] Jeff Roster: Feels like a lot to me, Brian.
[00:28:43] Brian Sathianathan: That's a lot, but guess what? There's another two to four hours that you're interacting with traditional companies. When you pump gas, go to the gym, get a haircut, get your car repaired, you're not getting that done with Google.
[00:28:59] You are getting [00:29:00] that done with any of the retail leaders that we all work with. So think about your data. It's still very precious and very special. Try to use your data in a very private, secure manner and train LLMs on it. Build IP moats for your business. So those would be my three.
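One common way to put private data to work the way Brian describes, without shipping it off to a third party for training, is retrieval-augmented generation: fetch the most relevant private documents and fold them into the prompt. This is a toy keyword-overlap retriever, purely illustrative of the pattern; a production system would use embeddings:

```python
def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Rank private documents by word overlap with the query; return the top k."""
    q = set(query.lower().split())
    scored = sorted(docs, key=lambda d: -len(q & set(d.lower().split())))
    return scored[:k]

def grounded_prompt(query: str, docs: list[str]) -> str:
    """Build a prompt grounded in the retrieved private context.

    The resulting string can be sent to whichever LLM the platform
    currently points at; the private data itself stays in-house.
    """
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"
```

The moat is the data, not the model: the same retrieval step works no matter which LLM sits behind it.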
[00:29:18] And Gautham, you know, feel free to jump in.
[00:29:21] Gautham Vadakkepatt: Yeah, if you don't mind, I will jump in. I think the first thing for leaders is to really understand the technology: understand where it's heading, assess the risks and the impacts, and diversify those risks away. That would be the first takeaway.
[00:29:37] The second, from my perspective, is to build trust. Ultimately, success is predicated on building trust, between partners and with consumers. And the last aspect, I know you talked about data and building data moats, but I'll take it one step more [00:30:00] basic: ensure the quality of the data.
[00:30:02] Ultimately you need a clear AI strategy and a sense of where generative AI fits into it, but at the essence of it is the quality of the data you deal with. And if you're able to derive high quality data and keep it protected, then you're able to build trust, and you're able to build use cases
[00:30:26] that actually mean something to the consumers, which in this case are mostly enterprise consumers. And you're able to continue the process of adapting to a rapidly changing environment. So those would be my three big things. But collaborate wisely might be the more short term takeaway to diversify your risks.
Conclusion and Final Thoughts
[00:30:48] Jeff Roster: And my final piece of advice, of course, would be to follow This Week in Innovation for all your AI learnings. Like and subscribe, as the kids would say. And wow, what a journey, [00:31:00] Brian. We've been going down this AI road in this pod really from the get go; it was one of the core foundational technologies we wanted to talk about. And wow, what a crazy weekend we've all had.
[00:31:12] Um, Gautham, where can people get ahold of you and follow your work? LinkedIn, I think, is probably your best place.
[00:31:20] Gautham Vadakkepatt: LinkedIn is the only place. The only place. I am an anti-social person, so LinkedIn is there for me.
[00:31:27] Jeff Roster: Okay. And Brian, how about yourself?
[00:31:31] Brian Sathianathan: Yeah, well, you know, you can get hold of Jeff.
[00:31:36] Jeff Roster: I'm Brian's secretary. That's okay.
[00:31:39] Brian Sathianathan: No, no, what I meant to say is, for This Week in Innovation, you can go to my website and...
[00:31:45] Jeff Roster: All that will be in the show notes. Well, gentlemen, thank you so much for jumping on this first emergency pod to respond to legit breaking news. A lot of fun chopping it up with you guys.
[00:31:55] See you on down the road.