Should I read this?

This is a case study from the R&D team of a large enterprise using a large language model (LLM) from their users’ mobile, web and thick client apps, without exposing any data or metadata to the Internet.  Specifically, an Azure OpenAI powered chatbot serving APIs to iPhone (iOS) mobile apps, without their Azure VNet being open to any inbound traffic (the firewall is set to deny-all inbound).  Consider it a sneak preview – the full case study will be published next.

Our customer’s requirements went beyond security.  They included:

  1. No VPNs, whitelisted IPs or MPLS dependencies.
  2. No touching the (B2B and B2B2C) mobile phones.
  3. No backhauling of all the app traffic – different microservices and APIs go to different destinations, directly from the mobile app.

Spoiler alert: all requirements met, with an open source based, software only solution…with the proof of concept (POC) done in one day.

If you have requirements like this, then this case study should be worth reading.  Otherwise, this post will be boring.

The AI dilemma

The astounding progress of Large Language Models (LLMs) is creating two opposing forces:

  1. We want to leverage the Artificial Intelligence (AI) capabilities in our apps, querying public LLMs (e.g. ChatGPT, Claude, Azure OpenAI) and self-hosted LLMs (e.g. Llama) – see the request sketch after this list.
  2. We need privacy and security. The data and prompts can contain sensitive, competitive or valuable information, including customer information.
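
To make this concrete, here is a minimal Swift sketch of the kind of chat completion request an app sends to Azure OpenAI’s REST API.  The resource name, deployment name, API version and key below are placeholders rather than values from the case study; the point is that the prompt itself is the sensitive payload, and sent straight from the phone it traverses a public endpoint.

    import Foundation

    // Placeholder values -- the real resource, deployment and key are
    // specific to your own Azure OpenAI instance.
    let endpoint = URL(string:
        "https://my-resource.openai.azure.com/openai/deployments/my-gpt4/chat/completions?api-version=2024-02-01")!
    let apiKey = "<AZURE_OPENAI_API_KEY>"

    struct ChatMessage: Codable { let role: String; let content: String }
    struct ChatRequest: Codable { let messages: [ChatMessage] }

    func askChatbot(_ prompt: String) async throws -> Data {
        var request = URLRequest(url: endpoint)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.setValue(apiKey, forHTTPHeaderField: "api-key")
        // The prompt (and any customer data inside it) is the payload we do
        // not want exposed on the public Internet.
        request.httpBody = try JSONEncoder().encode(
            ChatRequest(messages: [ChatMessage(role: "user", content: prompt)]))
        let (data, _) = try await URLSession.shared.data(for: request)
        return data
    }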

We have seen this movie before.  In fact, the security force is why our internal enterprise apps are only available via private networks such as our WANs or VPNs.  By shielding our internal enterprise apps, APIs and data from the Internet, we minimize the attack surface.  Unfortunately, many of the apps in which we want to use AI can’t be forced onto our WAN or VPN (and their connections to the cloud, such as ExpressRoute or Direct Connect), or doing so would add too much latency, too many failure points or too many user experience problems.

Solving the AI dilemma

So, we need a solution that enables our apps and APIs to access the LLMs without exposing any of the data to the Internet, and without needing to navigate MPLS or VPN.  Actually, this is exactly what NetFoundry’s OpenZiti (open source) and CloudZiti (SaaS) already do for billions of sessions per year – an LLM on the server side makes no difference to Ziti.  Ziti enables us to:

  1. Minimize the threat surface – prevent attacks from the Internet.
  2. Secure the road out too – help prevent data exfiltration.
  3. Provide great UX – no VPN or MPLS backhaul.
  4. Simplify operations – no complex ACLs, WAF configurations, DNS dependencies or other day-two, bolted-on, infrastructure-dependent, cloud-specific security add-ons.

In other words, the Ziti platform enables us to get both security and simplicity.  In this case, that means using LLMs without network exposure and without VPN.  It is the same Ziti platform which leaders such as Microsoft, LiveView and Intrusion use to securely deliver billions of sessions per year, for use cases such as those described in the related case studies below.

Note: you can skip this post and try the solution yourself – it is all software so you can spin up a free sandbox in minutes via CloudZiti, or dive right into the open source.

iPhone app to Azure AI: requirements

Our customer had these basic requirements:

  1. Develop an Azure AI (private version of ChatGPT-4) based chatbot.
  2. Incorporate the chatbot into their current mobile application, published as an iOS (iPhone) app.
  3. Ensure a private, secure connection between the iPhone app and Azure AI. Azure enables private networking within Azure between the customer VNet and Azure AI.  However, in the “shared responsibility model” which Azure (and every cloud provider) uses, securing the connection to Azure is our customer’s responsibility.

Azure does enable its customers to provision private MPLS circuits from their data centers to Azure, via meet-me sites.  However, in this case:

  1. Backhauling the mobile app sessions to the WAN and then the data center and then Azure would add latency and failure points to every application session.
  2. Adding VPN clients to all the mobile phones (most of which are not controlled by our customer) was not viable. The mobile app leverages multiple environments – only some of the APIs need to go to the Azure chatbot – making VPN backhaul even more unattractive.
  3. The mobile app is already deployed at scale. The cloud MPLS solutions (ExpressRoute, Direct Connect, etc.) are complex, creating high operational costs at scale.  For the privacy of our customer, here is a public example of what such a setup looks like (using AWS as the example):

iPhone app to Azure AI: solution

Here is what our customer did.  It turned out to be relatively simple, because it is an all-software solution.  In fact, the proof of concept was successful in one day:

  • On the Azure side, our customer used the Azure reference architecture, with a private Ziti router (from the Azure Marketplace) in their VNet. They then changed their firewall to deny all inbound traffic (the Ziti router opens outbound sessions only, yet still handles traffic originated by either the server or the client).
  • On the mobile side, they used Ziti’s Swift SDK in their app (a sketch of the app-side pattern follows this list). This gives strong identification, authentication and authorization, with delivery across their private Ziti overlay network, including mutual TLS (mTLS) and encryption.
  • The private Ziti overlay network in the middle is hosted by NetFoundry as part of the CloudZiti SaaS. Our customer could have self-hosted with OpenZiti (open source).
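
For orientation, here is a minimal app-side sketch of what that pattern looks like once the SDK is wired in.  The Ziti service name and the SDK bootstrap shown in comments are illustrative assumptions rather than verified signatures (the real integration follows the OpenZiti Swift SDK documentation); the key point is that the chatbot is addressed by a private overlay name with no public DNS record or open inbound port behind it.

    import Foundation
    // import CZiti  // OpenZiti Swift SDK (module name assumed)

    // Illustrative bootstrap, left as comments because the exact SDK calls
    // (identity loading, URLSession interception) should be taken from the
    // OpenZiti Swift SDK docs rather than from this sketch:
    //   let ziti = Ziti(fromFile: enrolledIdentityFile)  // load enrolled identity
    //   ZitiUrlProtocol.register(ziti)                    // intercept URLSession traffic

    // "chatbot.ziti" is a hypothetical Ziti service name, resolvable only on
    // the overlay -- it has no public DNS record or public IP address.
    let chatbotURL = URL(string: "https://chatbot.ziti/chat")!

    func askPrivateChatbot(_ prompt: String) async throws -> Data {
        var request = URLRequest(url: chatbotURL)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try JSONEncoder().encode(["prompt": prompt])
        // Same URLSession code as a normal API call; identity, authorization,
        // mTLS and routing to the outbound-only router in the VNet are handled
        // by the Ziti overlay underneath.
        let (data, _) = try await URLSession.shared.data(for: request)
        return data
    }

Because the router in the VNet only dials outbound to the overlay, the deny-all-inbound firewall rule stays in place while traffic initiated by either the client or the server still flows.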

Related case studies and use cases

Enabling mobile and web apps to leverage AI in a simple and secure manner is hot off the press.  We’ll share more details when the case studies are fully public.  In the meantime, here are similar use cases:

Private API gateway access for distributed endpoints

Here is a case study of consuming a MuleSoft API in the Oracle Cloud from the DigitalOcean cloud.  The result is direct connections over a private overlay with private IP addresses, and without any dependencies on MPLS or VPN.  Here are some other examples: Kubernetes (Oracle); Multicloud (IBM); Hybrid cloud (CERM); Kubernetes (SPS).

Securing access to Kubernetes – taking the Kubectl API off the Internet

Here is how Ozone used Ziti to manage their customers’ Kubernetes environments without requiring any inbound access.

Similarly, by adding the Ziti code to APIs like the Kubernetes API (see example here), the K8s API is made unreachable from the Internet.  Here are some more API and multicloud examples: Kubernetes (Oracle); Multicloud (IBM); Hybrid cloud (CERM); Kubernetes (SPS).

Summary

We can now use Artificial Intelligence (AI) capabilities in our web and mobile apps, querying public LLMs (e.g. ChatGPT, Claude, Azure OpenAI) and self-hosted LLMs (e.g. Llama), and yet maintain security and privacy.  The customer example above is specific to an iPhone (iOS) app querying an Azure OpenAI powered chatbot, but the same architecture works for other LLMs and other clouds.  You can start today:

  • CloudZiti (hosted SaaS), free for up to 10 endpoints…be up and running in minutes
  • OpenZiti open source zero trust networking platform
  • Instantly schedule a demo or briefing.