Unlocking the Power of AI: Insights and Experiences with Azure and Power Platform
Author: Luis Felipe López
Are you an IT professional or a business leader overwhelmed by the constant buzz around AI? Are you struggling to understand what it all means for your business and how you can leverage it for real-world benefits? As the person responsible for the AI agenda in my organization, I've been navigating these questions, and I'm here to share my insights and experiences. In this post, I will demystify AI for you, focusing on its implementation in Azure and Power Platform, and provide practical recommendations to help you navigate your own AI journey.
The noise, no doubt, can be overwhelming and ultimately lead to inaction. Lessons drawn from historical events like the introduction of the internet, where multiple business models were disrupted, have companies rushing to implement "AI solutions". Earnings calls and board meeting agendas have been hijacked by the AI mania. And although these statements are well intended, they often seem to lack a real understanding of what this technology is and of its limitations and opportunities, not only from a technical but even from a conceptual point of view.
As the person responsible for the AI agenda in my current organization and an enthusiast of this technology, I get the opportunity to talk to colleagues in business areas, in IT and in other organizations with which we collaborate. I'm constantly amazed by users and Product Owners who just want "AI" (yes, just AI), as if it were a Pandora's box capable of producing magical results. More surprisingly, practitioners are edging towards extremes: those who believe an AI solution is just a thin layer over (Azure) OpenAI whose value will vanish once the hype subsides, and those under the impression that every problem requires throwing AI at it.
In this post, I present the lessons I've learned over the last few months working with AI-Infused solutions. From this experience, I outline what I think it takes to succeed in implementing these kinds of solutions and offer some recommendations on how to get started.
AI, which AI?
Although "AI" has been a buzzword for more than a decade, the introduction of ChatGPT made tangible to the masses what a system of intelligence is capable of doing. This has led to the propagation of "AI FOMO" (fear of missing out) among Senior Executives, Product Managers and IT organizations in almost every industry. But what kind of AI are they talking about?
What we now call "Traditional AI" refers to algorithms designed to carry out particular tasks, relying on a predetermined range of inputs. These algorithms learn from historical data and make predictions about the specific task they were trained for, based on incoming data of the same nature as their training set. An example of this is a fraud detection model in banking, trained to identify unusual patterns or activities in users' accounts that might indicate fraudulent transactions.
Modern forms of AI, such as Generative AI, are capable of producing new content in various forms like text, images, and other media. These systems are often powered by Large Language Models, which are designed to create useful and simplified digital representations of language. These language models form a subset of what is known as 'foundation models', a broader category of AI systems that provide a wide range of capabilities and can be adapted to fulfill more specific tasks or purposes.
The distinction between these forms of intelligence is important because it serves as a guide to align the AI technology with the unique needs, goals, and resources of the organization. This understanding forms the foundation for strategic decision-making, enabling businesses to choose the right AI form that aligns with their specific use-cases, whether it's automating processes, generating new content, predicting trends or optimizing operations.
I'm particularly bullish on the modern form of AI, especially in the context of traditional companies. Historically, the lack of human capital trained in technology domains and the prohibitive costs related to AI meant they were mere spectators of technological progress. Foundation models, usually consumed via an API call, are changing this dynamic and opening the door for more organizations to reap the benefits of the AI revolution.
The Enablers of your AI journey
Ten months ago, my AI knowledge was mostly academic and anecdotal. This past year, I've actively participated in my company's AI journey, with a specific focus on what I previously called modern forms of AI, like Generative AI. While it is still early days, I've noticed patterns and decisions that have boosted our AI-Infused products and agenda. I have broken them down into two categories: organizational enablers and technical enablers; both facilitate or encourage the successful implementation and use of AI.
Organizational Enablers
Organizational enablers refer to a set of incentives, company culture, leadership support, an innovation mindset and a clear conceptual understanding of what AI is and what it is capable of. In my experience, four enablers stand out:
- Identify and nurture evangelists: these individuals are typically passionate about tech, understand how AI can enhance company processes, and have a strong drive to implement change. Their enthusiasm can be infectious, spreading interest and understanding throughout the organization. Effective leaders should aim to identify these evangelists and empower them to lead the AI agenda. If the company is fortunate, these individuals will step forward on their own. Their active involvement can mean the difference between kickstarting meaningful AI initiatives or ending up with flashy presentations and unfulfilled intentions.
- Identify quick but significant wins: AI enthusiasm can lead to overly ambitious projects that take time and lack immediate impact. For example, I was initially tasked with developing an end-to-end service desk bot. While such bold ideas hold long-term potential, gaining momentum requires quicker, impactful wins. These are not just low-hanging fruits, but valuable initiatives that can be launched quickly and offer immediate value. In our case, these included a ChatGPT-like app for safe employee interactions with GPT models and a translation app that retains original document formatting.
- Fearless and purposeful experimentation: the uncertainty around AI, particularly modern forms, can make companies hesitant. However, this very uncertainty should inspire experimentation, fostering a deeper understanding of the technology and its implications. Experimentation helps develop and document internal capabilities, preparing the organization for a future where Large Language Models (LLMs) are more advanced, mature, and possibly regulated. For us, experimentation has primarily involved creating quick prototypes based on users' ideas, utilizing LangChain to orchestrate the development of more complex AI applications, and exploring open-source models in Azure AI Studio.
- Engage end users: teams can sometimes overlook the importance of involving end users in the development of AI-Infused products. These users, familiar with daily company processes and challenges, can offer invaluable insights. Make time to listen to their issues and maintain communication post-product launch. This allows users to shape future updates, ensuring your product remains relevant and actively used – the ultimate goal.
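The quick prototypes mentioned under fearless experimentation usually boil down to the same pattern: a prompt template chained to a model call. Below is a toy, standard-library-only sketch of that orchestration idea; the "model" is a stub so the example runs without any API key, whereas in our real prototypes LangChain and an actual LLM play these roles. All names here (`make_chain`, `echo_model`) are illustrative, not part of any library.

```python
from typing import Callable

def make_chain(template: str, model: Callable[[str], str]) -> Callable[..., str]:
    """Compose a prompt template with a model call into one callable 'chain'."""
    def chain(**variables: str) -> str:
        prompt = template.format(**variables)  # fill in the template
        return model(prompt)                   # hand the prompt to the model
    return chain

def echo_model(prompt: str) -> str:
    """Stubbed model: reports what it received instead of calling an LLM."""
    return f"[model saw {len(prompt)} chars]"

# A tiny 'summarizer' chain built from the two pieces above.
summarize = make_chain("Summarize for a {audience}: {text}", echo_model)
result = summarize(audience="CFO", text="Q3 cloud spend rose 12% ...")
```

Swapping `echo_model` for a real completion call turns the same skeleton into a working prototype, which is why this kind of composition makes user-driven experimentation so fast.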
Technical Enablers
Technical enablers involve the right configurations, selecting appropriate tools, upskilling your team, and cultivating a mindset that encourages constant innovation and experimentation. In our journey, we've chosen Azure and Power Platform as our go-to platforms for AI delivery. Here are some recommendations and learnings to help you get started:
- Leveraging Azure for AI Capabilities: We chose Azure for its comprehensive AI capabilities. Back in April 2023, Azure was one of the few platforms offering access to OpenAI's GPT models without compromising proprietary data. At the time of writing, access to these models still requires companies to apply for it. Once approved, you can provision a new resource (Azure OpenAI), which will create an endpoint you can use in your applications.
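Once the resource is provisioned, calling it is a plain REST request against your endpoint. Here is a minimal, standard-library sketch: the resource name, deployment name and key are placeholders you would replace with your own, and the actual network call is left commented out since it needs a live resource.

```python
import json
import urllib.request

# Placeholder values -- substitute your own resource, deployment and API version.
ENDPOINT = "https://my-resource.openai.azure.com"
DEPLOYMENT = "gpt-4"
API_VERSION = "2023-05-15"

def build_chat_request(endpoint: str, deployment: str,
                       api_version: str, user_text: str):
    """Build the URL and JSON payload for an Azure OpenAI chat completion."""
    url = (f"{endpoint}/openai/deployments/{deployment}"
           f"/chat/completions?api-version={api_version}")
    payload = {"messages": [{"role": "user", "content": user_text}]}
    return url, payload

if __name__ == "__main__":
    url, payload = build_chat_request(ENDPOINT, DEPLOYMENT, API_VERSION, "Hello!")
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json", "api-key": "<your-key>"},
    )
    # Uncomment with a real endpoint and key:
    # with urllib.request.urlopen(req) as resp:
    #     print(json.load(resp)["choices"][0]["message"]["content"])
```

The same endpoint and key also back the Power Platform custom connectors discussed later, so this one resource can serve both pro-code and low-code consumers.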
Since then, Azure has introduced several features that make it an attractive host for AI workloads. Azure AI - Machine Learning Studio, for instance, allows you to deploy various proprietary and open-source models, orchestrate end-to-end development of LLM infused applications, and monitor AI models to detect any LLM drift. These features, available on a single platform, eliminate the need for point solutions and simplify the development process, accelerating time to market.
It is important to be aware that the models and deployments in Azure are still evolving quickly, so their use in production scenarios should be considered carefully. Moreover, the AI landscape is changing so fast that once you have AI products in the wild, you will soon realize the importance of AI/LLMOps practices to stay on top of your deployments and of the new functionality constantly added to the ecosystem you are using (Azure, in our case).
- Embracing Open Source for AI Development: An open-source strategy is essential in any AI journey. With new models being deployed and fine-tuned regularly and made available to the broader community (often via Hugging Face, a startup that creates open-source AI and ML libraries), AI becomes more accessible. Companies can select and deploy these models in their own infrastructure, reducing reliance on proprietary models.
The open-source nature allows developers to customize and extend these models, tailoring the code to meet specific requirements, adapting it to different domains, and integrating it with other tools and systems.
Our early experimentation with open-source models includes prototypes with Whisper, an automatic speech recognition (ASR) system we plan to use for voice-to-text experiences. We've also deployed models like Llama-2, which we intend to use alongside Azure OpenAI to provide more options for our users.
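To give a flavor of how low the barrier is, here is a minimal sketch of loading Whisper through the Hugging Face `transformers` pipeline. The size-to-model-id mapping is a small helper of our own, and the audio file name is hypothetical; running the transcription itself requires the `transformers` package and downloads the model on first use, so the import is kept inside the function.

```python
def pick_whisper_model(size: str) -> str:
    """Map a Whisper size name to its Hugging Face model id."""
    sizes = {"tiny", "base", "small", "medium", "large"}
    if size not in sizes:
        raise ValueError(f"unknown Whisper size: {size}")
    return f"openai/whisper-{size}"

def transcribe(audio_path: str, size: str = "small") -> str:
    """Transcribe an audio file with a Whisper model of the given size."""
    # Imported lazily: this requires the `transformers` package (and
    # downloads the model weights the first time it runs).
    from transformers import pipeline
    asr = pipeline("automatic-speech-recognition",
                   model=pick_whisper_model(size))
    return asr(audio_path)["text"]

# Example (needs `transformers` plus audio decoding support installed):
# print(transcribe("meeting.wav"))  # "meeting.wav" is a hypothetical file
```

The same few lines work for swapping in other open-source checkpoints, which is what makes this route attractive for reducing reliance on proprietary models.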
- Power Platform: models and orchestration capabilities are the foundation of AI-Infused solutions, but you still need to build software to deliver experiences to the end users. Rapid prototyping and iteration are crucial in the fast-paced world of AI. The low-code nature of Power Platform allows organizations to quickly turn ideas into prototypes, sometimes in just a few hours. It's a game-changer for constant experimentation and user validation, which are key in assessing a solution's viability.
Moreover, Power Platform democratizes AI by enabling specialist teams to create shareable components like custom connectors or UI functionalities. This empowers business units, who are closer to their own challenges and have a deep understanding of their processes, to prototype solutions without having to wait for support from central teams.
In our AI journey, we've used Power Platform to deliver almost all of our applications. This has enabled us to deploy new apps and features quickly, receiving constant feedback from users. We've launched five AI applications, two of which have shown significant value based on user adoption. Furthermore, we've deployed custom connectors that allow colleagues to build their own applications on top of Azure OpenAI GPT-4.
Conclusions
AI has virtually captured all conversations in the tech world and beyond. However, these dialogues often miss a fundamental understanding of the specific AI form under discussion and its desired capabilities. As you embark on your AI journey, comprehending the type of AI your project or company needs should be a top priority.
My experience in implementing an AI agenda and developing AI products has taught me the importance of certain enablers that can streamline and accelerate the process. Identifying and nurturing evangelists who drive the AI agenda is crucial for turning aspirations into action. Also, it's essential to balance your ambitions, focusing on smaller, impactful solutions that deliver business value and create momentum.
Alongside these organizational enablers, a strong technical foundation is vital. The first logical step is choosing the right cloud provider and resources. Determining the optimal delivery model is also crucial. In our case, Power Platform has been a key enabler, allowing us to rapidly prototype new ideas at a low marginal cost.
As we continue to navigate the exciting world of AI, we'd love to hear about your experiences. What has worked for you? What challenges have you encountered? Sharing our learnings helps us all grow. So, let's continue the conversation in the comments below!