OpenAI has arrived in Singapore – are enterprises taking the plunge?

Nicole Lim • 9 min read
Many financial institutions have turned to open source models to develop internal genAI tools. Will there be demand for OpenAI’s use cases? Photo: Bloomberg

Singapore was one of the first countries to announce its ambition to become an AI hub in 2019, two years before generative AI (Gen AI) took the world by storm. When the company behind ChatGPT chose the city-state for its second office in Asia, it signalled that the nation had cemented its place on the global stage.

"Singapore, with its rich history of technology leadership, has emerged as a leader in artificial intelligence,” said Sam Altman, CEO of OpenAI, in a statement following the firm’s expansion into the country. “We’re excited to partner with the government and the country’s thriving AI ecosystem as we expand into the Asia Pacific region.”

OpenAI reports that Singaporeans are among the highest per capita users of ChatGPT globally, accounting for the largest proportion of paid subscribers. The company adds that nearly one in four Singaporeans use ChatGPT every week.

The AI firm, until recently the undisputed leader in Gen AI, announced it would allocate up to US$1 million (S$1.37 million) to develop resources, including open datasets, aimed at ensuring AI models are better adapted to the diverse languages and cultures of Southeast Asia. This will be done through a partnership with AI Singapore.

OpenAI’s government collaborations include GovTech’s Build for Good, while its enterprise partnerships in the city-state extend to ride-hailing platform Grab. Other notable partners include software company Canva, Japan’s biggest lender MUFG and online retailer Rakuten.

OpenAI’s most popular product is its chatbot, ChatGPT. Since becoming a for-profit entity, the firm has released multiple versions of this core product, each tailored to different user needs. Its two latest flagship models, GPT-4o and o1, also have smaller variants — GPT-4o Mini and o1 Mini — designed for specialised tasks. o1 is the company’s reasoning model. On Jan 20, it saw competition from the Chinese challenger DeepSeek, whose R1 model has achieved comparable performance.

In its partnership with OpenAI, Grab said it “turned to OpenAI’s GPT-4o with vision fine-tuning to overcome its mapping obstacles over Southeast Asia.” Using its network of motorbike drivers and pedestrian partners, each equipped with 360-degree cameras, GrabMaps collected millions of street-level images to train and fine-tune models for detailed mapmaking.

GPT-4o’s vision fine-tuning has enabled GrabMaps to more accurately localise speed limit signs, turn restrictions, locations and road geometries. The Gen AI company has also partnered with Morgan Stanley to develop AI solutions that provide financial advisors with quicker insights, more informed decisions and efficient summarisation tools to strengthen client relationships. Morgan Stanley’s Wealth Management division has integrated GPT-4 into its workflows, with over 98% of its advisor teams actively using AI @ Morgan Stanley Assistant — the internal chatbot for financial advisors.

OCBC’s AI innovation
In late 2023, a year after ChatGPT’s release, OCBC launched OCBC GPT, a natural language processing tool designed for internal use by employees to assist with writing, research and ideation. The bank says this is the only Gen AI application it has that leverages an external model.

Instead, the bank has a division called the Group Data Office, headed by Donald MacDonald, that specialises in advanced analytics and AI. The executive says that data scientists within the division do all AI development and work almost exclusively with open-source models such as Llama 3.2 and Mixtral. “Basically, we’ve developed what we call a large language model sandbox, and new open-source models are coming out every week,” adds MacDonald. 

The bank has a two-pronged approach to Gen AI. The first is a suite of “universal applications”, or horizontal tools available to everyone across the organisation, which includes OCBC GPT. It also has an internal Gen AI knowledge assistant called OCBC Buddy, which allows employees to search more than 400,000 documents across the bank’s intranet.

The second approach is more vertically focused. MacDonald himself works with specific teams to build role-specific copilots, which are Gen AI tools deeply focused on each team’s work. One such tool is OCBC Wingman, which is designed to help the IT team develop software more quickly. OCBC says it has 30 different Gen AI use cases today, and employees use them 500,000 times a month.

There are three key reasons why the bank uses open source models instead of closed ones: innovation, data sensitivity and cost. Innovation in Gen AI is largely driven by the open source community, which is now able to “very quickly catch up” to the cloud giants, says MacDonald. “They’re definitely good enough for the use cases that we’re doing.”

Moreover, many of the bank’s use cases involve sensitive information that cannot be transmitted beyond the firewall — an issue frequently raised by financial institutions in their efforts to adopt Gen AI.

Finally, using open source models is more cost-effective. “If you’re using external models and you’re paying per application programming interface (API) call, when you have the sheer scale of models that we have, those costs would rapidly go into the millions of dollars. But by doing it on-premise — we already own our own GPU infrastructure — it allows us to run these things very, very cost-effectively,” says MacDonald. The only downside, he adds, is that models built by cloud giants will always be “slightly ahead,” though open-source models can catch up within a few months.

Pay-per-use or own? 
Gen AI APIs are billed based on the volume of input they receive and the volume of output they generate. Different providers measure these volumes differently, with some measuring tokens and others measuring characters, but the concepts remain the same.

According to OpenAI’s pricing document, a token is a unit of measure representing around four characters, so the word “hello” would be slightly more than one token. The company charges US$5 per million input tokens and US$20 per million output tokens for its real-time API. In comparison, Chinese competitor DeepSeek charges 14 US cents per million input tokens and US$2.19 per million output tokens for its DeepSeek-Reasoner model.
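To put those rates in perspective, here is a minimal back-of-envelope estimate in Python. The monthly token volumes are hypothetical, chosen purely for illustration; only the per-million-token rates come from the pricing figures quoted above.

```python
# Back-of-envelope API cost estimate using the per-million-token rates cited above.
# The workload figures are hypothetical, purely for illustration.

def monthly_cost(input_tokens: int, output_tokens: int,
                 input_rate: float, output_rate: float) -> float:
    """Cost in US$ given token volumes and rates quoted per million tokens."""
    return (input_tokens / 1_000_000) * input_rate + (output_tokens / 1_000_000) * output_rate

# Hypothetical workload: 50 million input tokens and 10 million output tokens a month.
workload = (50_000_000, 10_000_000)

openai_cost = monthly_cost(*workload, input_rate=5.00, output_rate=20.00)   # OpenAI real-time API rates cited above
deepseek_cost = monthly_cost(*workload, input_rate=0.14, output_rate=2.19)  # DeepSeek-Reasoner rates cited above

print(f"OpenAI real-time API: US${openai_cost:,.2f} per month")   # US$450.00
print(f"DeepSeek-Reasoner:    US${deepseek_cost:,.2f} per month") # US$28.90
```

At these hypothetical volumes the estimate works out to roughly US$450 a month versus US$29, and because billing is per token, the gap scales linearly with usage.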

OCBC’s MacDonald explains that some organisations experimenting with Gen AI on the cloud would have to calculate the business case and justify the spend per API, but “that is not really a challenge for us” as the bank already has the existing infrastructure. 

If the bank chooses to experiment with Gen AI externally using APIs, it would have to pay for every API call; some use cases, such as transcribing and categorising customer calls, might cost billions of dollars. 

“So for us, Gen AI wasn’t a lot of additional investment. It was leveraging stuff that we’d already put in place. We did have to upsize our graphics processing unit (GPU) infrastructure. We needed more (infrastructure) to support some of the use cases that we’re executing; that was really [all] the additional cost,” he says. “The incremental cost to us is very, very low”. 

Similarly, DBS reports that in 2023, it generated over 240 Gen AI ideas, 20 of which are currently being implemented. In total, the bank has more than 1,500 AI models across over 370 use cases, with an estimated annual economic impact of around $780 million. Having doubled year on year in recent years, this figure is expected to exceed $1 billion.

Like OCBC, DBS deployed DBS-GPT, a programme inspired by OpenAI’s ChatGPT, to help employees generate content and handle writing tasks in a secure environment. Another key initiative is its in-house Customer Service Officer (CSO) Assistant, which transcribes customers’ queries and retrieves information from the bank’s database in real time. CSO Assistant also helps with post-call documentation by providing instant call summaries and pre-filling service request fields.

Back office to the frontlines 
Today, Gen AI is extensively used within financial institutions to improve internal workflows. However, OCBC’s MacDonald does not think many banks today are actually making Gen AI directly available to an end customer. For instance, insurance giant Prudential launched its global AI lab in Singapore in November 2024 to “accelerate the adoption of AI and machine learning in the organisation.” The lab focuses on healthcare, wealth, insurance distribution and operations.

The firm is also tapping selected open source and proprietary large language models (LLMs) and the latest research in collaboration with its ecosystem partners, although it does not specify which. One of the use cases the Prudential AI Lab is working on leverages Med-PaLM 2, an LLM designed to provide “high quality” answers to medical questions. It is used in medical claims processing to help Prudential’s agents handle claims more efficiently. The firm said several proof-of-concept tests showed that using the LLMs improved the straight-through processing (STP) rate threefold.

“My gut feeling is that we’re going to see Gen AI touch the customer directly on the banking side within the coming year. Here in OCBC, we already have the routines that can identify when our models are hallucinating and fix the hallucinations before we give answers to employees,” says MacDonald. 

A trend is emerging among financial institutions: the use of agentic systems, a form of AI that autonomously makes decisions, takes actions and self-optimises in real time. Prudential is piloting these systems to offer concierge-like services for tasks such as coordination and appointment scheduling, while OCBC is using them to break down complex tasks into more manageable components.
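Neither firm has published implementation details, so the sketch below is only a generic illustration of the agentic pattern, not Prudential’s or OCBC’s actual design: a planner breaks a task into steps, an executor routes each step to a tool, and a check decides what to keep. The planner, tool and stopping rule here are hypothetical stand-ins for what would, in practice, be LLM calls and production systems.

```python
# Generic sketch of an agentic loop: plan a task, execute each step, check the result.
# The planner and tool are hypothetical stand-ins; a real system would call an LLM
# for planning and verification, and production services (calendars, CRMs) as tools.
from typing import Callable

def plan(task: str) -> list[str]:
    # Stand-in planner: in practice an LLM call returning a structured list of steps.
    return [f"{task}: step {i}" for i in range(1, 4)]

def run_agent(task: str, tool: Callable[[str], str], max_steps: int = 10) -> list[str]:
    results = []
    for step in plan(task)[:max_steps]:
        outcome = tool(step)
        # Self-check: a real agent would ask the model to verify the outcome and
        # re-plan on failure; here we simply keep outcomes the tool marks as done.
        if outcome.endswith("done"):
            results.append(outcome)
    return results

if __name__ == "__main__":
    # Toy tool that "completes" every step it is given.
    demo_tool = lambda step: f"{step} -> done"
    for line in run_agent("schedule a client appointment", demo_tool):
        print(line)
```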

When asked about paying a premium for Gen AI products, Prudential says it is committed to building an ecosystem of partners to co-develop advanced AI solutions. This includes collaborations with higher education institutions, research centres, government agencies, technology partners and AI start-ups. The company currently partners with Google Cloud to provide its employees with access to cutting-edge AI solutions.

OCBC’s MacDonald says that as more “flagship” companies like OpenAI establish a presence in Singapore, awareness of AI will grow, fostering talent and attracting investment, ultimately strengthening the AI ecosystem. 
