The Emerging Market Equities Podcast

Deepseeking – China’s LLM leap

abrdn

In the latest episode of the Emerging Market Equities podcast, Nick Robinson sits down with Pruksa Iamthongthong to take a deep dive into the current world of AI.

Nick: Hello everybody, this is Nick Robinson from abrdn and you're listening to the Emerging Markets Equity Podcast, the show that explores the factors that underpin our thinking on emerging markets. We ask our expert guests the big questions, from key individuals to evolving trends, all with the goal of identifying and profiting from opportunities in the region. This month, we revisit a topic we last discussed back in 2023, and it's been a key driver of markets ever since: artificial intelligence and its impact on emerging markets. In 2023, we were digesting the arrival of large language models and grappling with how their development would impact the ecosystem around them, from the chips used to supply the huge processing power needed to the impact they could have on the companies that use them. With the benefit of a bit of time passing, we now have a better picture of how AI adoption is unfolding and what it means for the outlook for companies that use it. So, I'm delighted to be joined today by my colleague Pruksa Iamthongthong, who is Deputy Head of Asian Equities, based in Singapore, and also one of our Asian technology experts. Pruksa, welcome back to the podcast. It's great to have you back on.

Pruksa: It's great to be back, Nick.

Nick: Brilliant. Well, let's get cracking. So, like I said, it's been a while since we last talked about AI, and back then we approached it more through the picks and shovels of LLMs and how to play the data centre investment theme. Perhaps you could talk a bit about how our understanding has evolved since then?

Pruksa: Sure. It has been a very exciting and interesting year, where, as you can imagine, a lot of things have changed, and yet certain things haven't really changed. Just to take a step back, when we talk about the picks and shovels of LLMs, we are really talking about just one of the layers of the ecosystem. To recap, there are three main layers of the AI ecosystem. The first layer is what we call the foundational models. These are your LLMs, which stands for large language models. The second layer is the semiconductors and infrastructure, as well as the data centre investments, which are really the investments that help to build the foundational models. And the third layer is the applications and software that sit on top, built upon the foundational models, that interact with consumers or enterprises. Where we are today, I would say the three layers of the AI ecosystem haven't changed, but obviously there has been evolution within them. On the foundational models, we have moved from pre-training very large language models towards smaller models and reasoning models, and we can dive into the details of that later; certainly a lot to talk about. The second layer is the semiconductor infrastructure. Within Asia, we talk about the advanced chip supply chain, including things like memory. You might have heard of a term called high-bandwidth memory: this is really advanced, high-spec memory that allows a lot of data to be stored and moved quickly. Then you have the enablers, simply because when you are building a more powerful server, you also need a more complex power system and a better cooling system, because it generates a lot of heat.
You also need a lot more electricity, and that extends into investment outside of tech, into things like the electricity grid or transformers. That has been one of the broader upgrade trends you see within the semi infrastructure, all the way down to the electricity side of things. And then you have the third layer, which is the software and application layer: the IT companies and software companies that are the ones that interact with you. So those are the shifts that have been happening. Looking back, I would say the last two years have been really about the first two layers, the foundational models and the picks and shovels of the semis and infrastructure. But increasingly, we are starting to see the emergence of the third layer, coming up towards the end of last year. I think this also ties into the trend that software globally, particularly in the developed markets, has been going through a bit of a cyclical downturn, and we are just hitting that inflection point. As we find more use cases for AI and as the cost of AI adoption decreases, you should expect to see adoption within software pick up. So, I would say the third layer has certainly become more interesting for us in terms of opportunities as well.

Nick: Yeah, that's really interesting. I guess how I've been thinking about it is that the first couple of layers are more about CapEx and the investment that companies are making, whereas the third layer is really about monetization. At least when we last talked, the monetization part of the story was the least clear part to us. But given how we've seen the Amazons and Googles all ramp up CapEx spending even more, are we any closer to getting a better idea of how monetization is going to work, particularly within that third layer?

Pruksa: Yes, I think so. We can actually look at this from two perspectives. The first perspective is from the existing players you've mentioned, the hyperscalers, the Amazons and Googles. A year on, what we have seen when you look at the latest transcripts and results is that all of them talk about how demand for AI remains pretty strong. AWS, which is Amazon's cloud business, talked about a multi-billion-dollar annualized run-rate business in AI, growing at a very high percentage year on year, and it would actually have grown faster if not for capacity constraints. So demand there remains pretty strong. And when you look at what Google, Microsoft or Meta are saying, they are basically singing the same tune with regard to demand. So AI adoption continues to be strong, and a lot of this is led by cloud. But what we also saw over the course of last year is demand for AI moving into the enterprises. The software companies, as our developed markets team have talked about a lot, ServiceNow for example, are seeing a lot of AI adoption among their enterprise customers. And the hyperscalers are spending a lot of CapEx while seeing the returns and growth to justify that level of investment. We expect this to continue to be the case, given they have also just announced their CapEx plans for 2025, which remain quite strong as well. The second perspective is from new players, given how the cost of AI has been coming down. To give an example, Microsoft talked about a two-times price-performance gain for every hardware generation, and more than ten times for every model generation due to software optimization. What this means is that AI is becoming a lot cheaper and more efficient, very fast.
So we should actually see more players coming into this space: smaller players that found it a bit too expensive to invest in large language models in the past, but are now able to participate in the game as well. And for them, if the cost of AI is coming down, then their ability to participate, their monetization and their ROI actually go up over time. So we are perhaps at the start of a very interesting stage: the broadening out of AI adoption into the wider economy, instead of it being more narrowly led, as in the past.

Nick: Yeah, and certainly, thinking about the third layer and how that's working out, you can see it quite clearly in the US markets with the rise of Microsoft, ServiceNow, Palantir and those types of companies that are really jumping on the AI adoption story. It's interesting you mentioned the cost of AI, because something that took markets completely by surprise a few weeks ago was the arrival of DeepSeek, the Chinese LLM reasoning model. Being China-based, this is very relevant for emerging markets. How do you think that has changed the picture? And do you think we might see more of these third-layer type companies come out of China, now that it is clearly closer behind the US than anticipated in developing these models?

Pruksa: Yes, I think so. I think that's one of the implications of DeepSeek. But there is quite a bit to unpack around what we call the arrival of DeepSeek, which certainly kept our Chinese New Year very busy, spent looking into this and understanding what it means. To be clear, we certainly did not predict that DeepSeek would happen. But what we did observe is that at this early stage of AI, there are bound to be disruptions, and within the technology industry there will always be a search for ways to drive better cost and performance. So, if it's not DeepSeek, it's probably going to be someone else, and going forward we should actually expect more breakthroughs to come. That's why existing players will need to evolve, to redefine what it takes to stay competitive. We are already seeing the shift towards what you call a reasoning model today. DeepSeek is a reasoning model, very similar to OpenAI's o1. What a reasoning model means is that the model is able to perform more complex tasks by simulating a human-like thought process, thinking through a problem step by step before providing an answer. I don't know, Nick, if you have tried asking ChatGPT in its reasoning-model mode. I have, and it actually takes a longer time to think: it gives you the answer, but it also shows you how it got to the answer. So it is quite a different experience from the normal ChatGPT mode, where you get a sort of instant answer; there is a bit of a time lag, and it reflects the thinking part of the model. But back to the point about disruption: we should expect this to be the norm. In terms of implications, we can look at this from both a glass-half-full and a glass-half-empty scenario.
If you look at what happened to the market, the global sell-off ex-China, you will see that the market subscribed to the glass-half-empty scenario: that we will now be spending much less on AI and much less on compute, because it's cheaper, and that the infrastructure story, the layers we talked about, is now over. We actually take a different view and subscribe to what you would call the glass-half-full scenario. We see what has happened, the disruption and the push for cost and performance, as very much in the nature of the tech industry. Take Moore's Law, which has been going on, and still is going on, for over 50 years. Adoption within the tech space accelerates when costs come down. Under Moore's Law, that happens at a pace of two times every 18 months, but for AI it is actually much faster. So you should expect the adoption of AI to be even faster, given, again, that we are at a very early stage, with more use cases and lower costs allowing broader adoption. We actually think the end market is going to be bigger, and therefore we are going to have more compute over the long term as well.

Nick: What do you think the impact is on China and geopolitics of this big step forward that China has made? Much has been made of the various semiconductor restrictions that China was subject to, yet they developed this model very well and not very far behind the US. And with things like the killer app of AI, autonomous driving, China seems to be doing very well on that as well compared with the US. So how do you think this is likely to influence the US's relationship and tensions with China?

Pruksa: I actually do think this might mean that the restrictions get a bit tougher and relations remain quite tense. The reason is that, despite all the export restrictions, where China is not able to get the most advanced GPUs out of the US, China, through DeepSeek, has actually demonstrated that it is able to innovate its way out of the constraints. So what this means is that restrictions will tighten, and China will continue to innovate. The localization angle is coming up as well, and is improving pretty rapidly within the semiconductor space. We are now hearing about locally made chips, whether it's Huawei's Ascend chips or those from the other local chip designers. This is coming up, and it will help China continue to push through its AI development at the pace we have seen. And it's quite interesting that you brought up autonomous driving, because just a couple of days ago BYD announced that they are integrating autonomous driving on the highway for their cars, and a lot of this goes into the mass-market models as well, so it no longer sits only in the most high-end models. One of our colleagues has just been on a test drive on the highway for 57 minutes. She shared her experience in her write-up, and essentially she said it was working pretty well: there were only one or two human interventions, because the car crossed the line. And given that autonomous driving doesn't actually use the most advanced chips, you don't need two-nanometre or three-nanometre at that stage, China has what it needs, together with software optimization and the large amount of data they have collected.
So we should actually see this picking up pretty fast in China, and it could be one of the use cases of AI going forward within this space as well. From an AI ecosystem perspective, you are very likely to see a China ecosystem developing alongside the ex-China one, and we can have a debate over whether it's better to have an open-source or a closed-source model. But within China, things are happening very rapidly. And we haven't really talked about what's happening with the large internet companies, which have actually been seeing fund flows rotate from the US markets into the China markets as well. We can go into that, if that is of interest.

Nick: Yeah, I'd love to go into that. One thing that's been very apparent in the US is the concentration of a lot of the benefits of AI in the Mag Seven and the hyperscalers. It would be interesting to get your views on that within China, given that you've just had Alibaba, for instance, announce this partnership with Apple to bring AI onto the iPhone, and whether we might see a similar thing evolving in China in terms of that concentration in the large internet and new-economy companies.

Pruksa: Yeah, sure. The large internet companies in China, whether you look at Alibaba, Tencent or ByteDance, are at the forefront of investing in large language models. And some of them are already starting to monetize, back to that monetization question we raised earlier, and see returns from this, just with slightly different use cases. To give you an example, as early as mid last year when we visited Tencent, they were talking about using AI to lower their games content development costs: you lower your engineering usage, you reduce your games content development costs, and therefore you are able to develop games at a lower cost, increasing your margins. They also talked about smarter advertising targeting through AI, which helps to improve what you call the click-through rate, and that over time improves the returns on its advertising business and increases its margins. So this was already coming through as early as last year, and we do expect this pace to continue. We have also seen the excitement around Alibaba, even before the cooperation with Apple, and that's really because Alibaba has a very big cloud business in China, where the company has been investing in its LLMs, as mentioned. Its LLM is called Qwen 2.5, which DeepSeek actually leaned upon to develop its model. But until very recently, the market had not given much value to its cloud business, and that is one of the reasons you see the strength in the share price. Now, moving away from the large internet companies, China also has very established vertical players like Trip.com. When we spoke to the CEO, Nick, you were in that meeting in London as well, she was talking about how she has been adopting AI to lower costs and improve the value they bring in serving their customers.
For those that don't know, Trip.com is like Booking.com: this is the China business, and they are expanding outside of China as well. What we have seen over the last two years, whether you look at Trip.com or the internet companies, is that the growth backdrop has not been too bright in China, which means improved automation and AI adoption have actually played a large part in driving productivity gains and improving their returns on investment. Finally, we should not underestimate the innovation that can come out of China's ability to self-disrupt. China is a very competitive market with many players, and everyone has the incentive to try to win the race. So, even as a lot of people are talking about DeepSeek, a couple of days ago ByteDance also came out with an announcement that they have a video-generation model, very similar to Sora, and they are setting a new benchmark there as well. This text-to-video generation model is called Goku+, and apparently it can reduce video-ad production costs by 99% compared with hiring social media content creators. So again, things are just happening so fast. Plenty of use cases in China, and we should expect disruption to be the norm.

Nick: Yeah, great. There's a huge amount going on. I also read our colleague's note on BYD; that's really interesting in terms of what mass-market adoption of autonomous driving looks like. It's pretty close now in China if these cars are rolled out. Just thinking about some of the other trends, what other things should we be looking out for now in the whole AI ecosystem? And how do you think we should approach them in terms of investing?

Pruksa: Sure. So, definitely many moving parts, and some are nearer term than others. We are really excited to see how this will continue to evolve, but I would just like to highlight a few things that we think are not too far away. The first, given that our conversation is about AI, is how the intelligence is moving into action, and by action we mean action in a few areas. The first area is humanoid robots: really the combination of AI software with the robotics and automation space. Again, this has caused a lot of excitement within the robotics supply chain that supplies Tesla, for example. But we can imagine that, as the cost of adoption comes down and performance improves, Tesla won't be the only one adopting this. We should see it become very much part of factories' moves to improve automation and productivity in places that are going to face labour-cost challenges or an ageing population, like China. So humanoid robots are one area to look into, and it is happening as we speak. The supply chain there is across Asia, with some of it in China, and it could be something as precise as a robotic-gear production company, for example. Another area within the action space is the digital factory. This is not really that new, but again we are seeing quite an increase in adoption; you can imagine this being part of Nvidia's Omniverse strategy from earlier on as well. Just to share an anecdote: I was in a city in China mid last year, and we went to visit China's largest battery manufacturer and their new super line.
What they showed us is that alongside the robotic arms doing the work, there is a digital-twin version operating in parallel, and they are currently collecting all the data and really simulating what's going on there. So, even as we speak, this notion of the digital twin, the digital factory and the adoption of AI is going to be pretty real there as well. The third area in the action point is AI agents. This is something OpenAI started to talk about this year with the launch of OpenAI Operator, and if you look at what Microsoft or Meta have been saying, a lot of them have been talking about AI agents as well. Simply put, an AI agent is like your AI assistant. It's really about looking forward to the day that AI can help you do all the mundane tasks, like grocery shopping online. Or, in the case of Salesforce, which one of our developed markets colleagues has been talking to me about, you can get your AI assistant to pull up all the relevant data about a client before you go and see them. Very easily done, and that is already available. Meta goes a bit further and talks about an AI engineering agent, with the coding and problem-solving abilities of around a good mid-level engineer, that they are planning to build this year. I think that could be pretty disruptive, but it also represents a very large opportunity for the Indian IT outsourcing companies we are invested in, given that the bulk of their costs are actually engineering costs, and it frees up their time to do more higher-value-added services. So when you think about companies involved in these supply chains, these are the areas that are going to benefit. I'll just touch on one more trend, as I know I've already taken quite a bit of time: the growing trend of custom silicon.
What this means is that, in the past, companies may have been using a lot of GPUs from Nvidia, but over time we are seeing the adoption of self-designed chips, which is custom silicon. AWS has been doing this, and part of that has been benefiting the supply chain within Taiwan as well. As for why someone would do this, to quote AWS, Trainium 2 is the name of their chip, and it achieves about 30 to 40% better price performance versus their current solutions. So again, the notion of the price-performance trade-off and the ability to drive efficiency doesn't change here. As for the implications, we would have to think about whether certain supply chains and companies are more leveraged to one ecosystem than another. But if you think about where they would all be manufacturing their chips, I've seen quite a few reports that they are thinking about doing it at three nanometres, and it's probably all going to be largely done at TSMC, which positions itself as everybody's foundry. So from TSMC's perspective, it's pretty agnostic. And finally, the last point goes back to disruption being the norm: we should expect more gen-AI apps and AI models to come up, catering to different use cases, different workloads and lower costs, to drive more application usage and ultimately a larger compute opportunity.

Nick: Great. And so, in terms of the investment opportunities associated with those trends, is there anything you could draw out?

Pruksa: Yeah. One thing I forgot to mention in terms of investment opportunities specifically, again back to the three layers of the AI ecosystem. On the foundational model side, it is a bit difficult to find opportunities within Asia. We may see some in the China internet names, but we would put those into the application and software bucket instead. On the semiconductor and infrastructure side, and the enablers we talked about, I think the longer-term picture has not changed. There might be a bit of risk in terms of air pockets in 2026; that remains to be seen. But if our underlying belief is that compute will grow bigger as a result of lower AI costs, then the longer-term picture doesn't change, and it becomes a matter of how much you pay for it: the pricing versus the opportunity size. The third area is the application and software layer I've talked about, and for this year this is the layer that is becoming a lot more interesting from a risk-reward perspective, also because these companies have just come off a bit of a cyclical low from an earnings perspective. You have plenty of that in China, and plenty of that in the Indian IT outsourcing companies as well. And there is one last bit, which is the AI supply chain, that I have not talked about. The idea there is that, as the cost of processing AI becomes cheaper, you can afford to do it at the device level. So whether it's your smartphone or your AI PC, you will actually see a faster replacement cycle, which has been pretty long delayed, and that will help drive an upgrade of your devices. So that's the device side of things as well, and plenty of opportunities there.

Nick: Well, brilliant. That feels like a very comprehensive summary of where we need to look for investment. And given the amount of disruption going on, it really feels like we need to double down in terms of going out there and finding the companies that are going to benefit. So, I think that's probably a good place to leave it. Let's go and find those companies. The only thing to do now is to thank you, Pruksa, for joining. Thanks so much, that was really great today.

Pruksa: Thanks, Nick. My pleasure. 

Nick: And thanks also to everyone who took the time today to listen in. If you enjoyed today, then please download our other podcasts from our website or wherever you normally get your podcasts. Watch out for the next episode and tune in.