Original author: Advait (Leo) Jayant
We want highly personalized recommendations from Netflix and Amazon. At the same time, we do not want Netflix or Amazon to know our preferences.
In today's digital age, we enjoy the convenience of personalized recommendations from services such as Amazon and Netflix that cater precisely to our tastes. At the same time, the penetration of these platforms into our private lives is causing growing unease. We want customized services without sacrificing privacy. This long seemed a paradox: how can a service personalize for us without our handing large amounts of personal data to cloud-based AI systems? Fully homomorphic encryption (FHE) provides a solution that lets us have the best of both worlds.
Artificial Intelligence as a Service (AIaaS)
Artificial Intelligence (AI) now plays a key role in addressing complex challenges in multiple fields including computer vision, natural language processing (NLP), and recommender systems. However, the development of these AI models poses significant challenges to the average user:
1. Data volume: Building accurate models often requires huge datasets, sometimes even reaching the scale of petabytes.
2. Computational power: Complex models such as Transformers need the compute of dozens of GPUs, often running continuously for weeks.
3. Domain expertise: Fine-tuning these models requires deep domain expertise.
These obstacles make it difficult for most users to develop powerful machine learning models independently.
AI as a Service Pipeline in Real Applications
Enter the era of AI as a Service (AIaaS), which overcomes these barriers by providing cloud services managed by technology giants (including FAANG members) to give users access to state-of-the-art neural network models. Users simply upload their raw data to these platforms, where it is processed to generate insightful inferences. AIaaS effectively democratizes access to high-quality machine learning models and opens up advanced AI tools to a wider audience. Unfortunately, today's AIaaS brings these conveniences at the expense of our privacy.
Data Privacy in AI as a Service
Currently, data is encrypted only during transmission from the client to the server. The server has access to the input data and the predictions made based on that data.
In the AI as a Service process, the server has access to both input and output data. This makes it difficult for ordinary users to share sensitive information, such as medical and financial data. Regulations such as the GDPR and CCPA heighten these concerns: they require users to explicitly consent before their data is shared and guarantee users the right to understand how their data is used. The GDPR further mandates that data be encrypted and protected in transit. These regulations set strict standards for user privacy and rights, demanding clear transparency and control over personal information. Given these requirements, AI as a Service (AIaaS) pipelines need strong privacy mechanisms to maintain trust and compliance.
FHE Solves the Problem
By encrypting a and b, we can ensure that the input data remains private.
Fully homomorphic encryption (FHE) provides a solution to the data privacy problems associated with cloud computing. An FHE scheme supports operations such as addition and multiplication directly on ciphertexts. The idea is simple: the sum of two encrypted values decrypts to the sum of the two plaintexts, and the same holds for multiplication.
In practice, it works as follows: the user computes the sum of two plaintext values a and b locally. The user then encrypts a and b and sends the ciphertexts to the cloud server. The server performs the addition on the encrypted values (homomorphically) and returns the result. Decrypting the server's result yields the same value as the local plaintext sum of a and b. This ensures data privacy while still allowing the computation to run in the cloud.
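The additive half of this property can be sketched in a few lines of Python. The toy LWE-style scheme below is illustrative only: the parameters are far too small for any real security, and it supports only addition, but it shows the key point that adding two ciphertexts componentwise yields a ciphertext of the sum.

```python
import random

# Toy, insecure LWE-style additively homomorphic scheme (illustration only).
N = 64            # secret-key dimension
Q = 2 ** 16       # ciphertext modulus
T = 16            # plaintext modulus
DELTA = Q // T    # scaling factor placing the message in the high bits

def keygen():
    return [random.randrange(Q) for _ in range(N)]

def encrypt(sk, m):
    a = [random.randrange(Q) for _ in range(N)]
    e = random.randrange(-4, 5)  # small noise term
    b = (sum(ai * si for ai, si in zip(a, sk)) + e + m * DELTA) % Q
    return (a, b)

def add(c1, c2):
    # Homomorphic addition: just add the ciphertexts componentwise.
    a = [(x + y) % Q for x, y in zip(c1[0], c2[0])]
    return (a, (c1[1] + c2[1]) % Q)

def decrypt(sk, c):
    a, b = c
    v = (b - sum(ai * si for ai, si in zip(a, sk))) % Q
    return round(v / DELTA) % T  # rounding removes the accumulated noise

sk = keygen()
c = add(encrypt(sk, 3), encrypt(sk, 5))
print(decrypt(sk, c))  # 8: the server never saw 3, 5, or their sum
```

Note how the noise terms of the two ciphertexts add up; real FHE schemes must manage this noise growth (e.g., via bootstrapping) to support unbounded computation.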
Deep Neural Network (DNN) Based on Fully Homomorphic Encryption
In addition to basic addition and multiplication operations, the technology of neural network processing using fully homomorphic encryption (FHE) has made significant progress in the AI as a service process. In this context, users can encrypt the original input data into ciphertext and transmit only these encrypted data to the cloud server. The server then performs homomorphic calculations on these ciphertexts, generates encrypted outputs, and returns them to the user. The key is that only the user holds the private key, enabling it to decrypt and access the results. This builds an end-to-end FHE encrypted data flow, ensuring the privacy of user data throughout the process.
Neural networks based on fully homomorphic encryption provide users with significant flexibility in AI as a service. Once the ciphertext is sent to the server, the user can go offline because there is no need for frequent communication between the client and the server. This property is particularly beneficial for IoT devices, which often operate under constraints and where frequent communication is often impractical.
However, it is worth noting the limitations of fully homomorphic encryption (FHE). Its computational overhead is huge; FHE schemes are inherently time-consuming, complex, and resource-intensive. In addition, FHE currently has difficulty effectively supporting nonlinear operations, which poses a challenge for the implementation of neural networks. This limitation may affect the accuracy of neural networks built on FHE, as nonlinear operations are critical to the performance of such models.
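One common workaround for the nonlinearity problem is to replace an activation function with a low-degree polynomial, since FHE handles additions and multiplications natively. The sketch below fits a cubic polynomial to the sigmoid by least squares; the interval and degree are illustrative choices, and the residual error is exactly the kind of accuracy loss described above.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Fit a degree-3 polynomial approximation of sigmoid on [-5, 5].
# FHE can evaluate the polynomial (adds and multiplies) but not
# the exact sigmoid, so the fit error becomes model error.
xs = np.linspace(-5, 5, 1001)
coeffs = np.polyfit(xs, sigmoid(xs), deg=3)
poly = np.poly1d(coeffs)

max_err = np.max(np.abs(poly(xs) - sigmoid(xs)))
print(f"max |sigmoid - poly| on [-5, 5]: {max_err:.4f}")
```

Higher degrees shrink the error but multiply the ciphertext noise and runtime, so degree choice is a real accuracy/performance trade-off in FHE inference.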
K.-Y. Lam, X. Lu, L. Zhang, X. Wang, H. Wang, and S. Q. Goh, "Privacy-Enhanced Neural Networks Based on Efficient Fully Homomorphic Encryption for Applications in AI as a Service", by researchers at Nanyang Technological University (Singapore) and the Chinese Academy of Sciences (China).
Lam et al. (2024) describe a privacy-enhanced neural network protocol for AI as a service. The protocol defines the input-layer parameters using learning with errors (LWE), a cryptographic primitive that encrypts data in a way that allows computation on the ciphertexts without decrypting them first. For the hidden and output layers, parameters are defined using ring LWE (RLWE) and ring GSW (RGSW), two techniques that extend LWE to support more efficient homomorphic operations.
The common parameters include a decomposition base and its associated gadget parameters. Given an input vector x of length n, the client produces n LWE ciphertexts, one for each element x[i], under the LWE secret key s. Evaluation keys are generated for the entries of the secret key (covering the cases where an entry is positive, zero, or negative), along with a set of LWE key-switching keys. Together, these keys enable efficient switching between the different encryption schemes used in the protocol.
The input layer is designated as layer 0 and the output layer as layer L. Each layer ℓ from 1 to L has n_ℓ neurons, together with a weight matrix W_ℓ and a bias vector b_ℓ. For each neuron h from 0 to n_ℓ − 1, the LWE ciphertexts from layer ℓ − 1 are combined under homomorphic encryption to evaluate the linear function given by the h-th row of W_ℓ and the h-th entry of b_ℓ. A lookup table (LUT) is then evaluated homomorphically on the result to apply the activation function, the ciphertext is switched down to a smaller modulus, and the result is rounded and rescaled. The resulting ciphertext joins the set of LWE ciphertexts for layer ℓ.
Finally, the protocol returns the output-layer LWE ciphertexts to the user, who decrypts them with the private key s to obtain the inference result.
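To make the layer-by-layer structure concrete, here is a plaintext Python sketch of the computation the server performs: a linear step, a round-and-rescale step, and an activation evaluated as a lookup table over a small discrete range. All names and parameter choices (the range size T, the rescale factor 4, ReLU as the activation) are illustrative, not the paper's actual parameters; in the real protocol every value below would be an LWE ciphertext and every step would run homomorphically.

```python
import numpy as np

T = 64  # size of the discrete plaintext range [-T//2, T//2)

# Activation realized as a table over the plaintext range, mirroring
# the homomorphic LUT evaluation step. Here the table encodes ReLU.
TABLE = np.array([max(v - T // 2, 0) for v in range(T)])

def lut(v):
    return np.array([TABLE[int(x) + T // 2] for x in v])

def infer(x, layers):
    """layers: list of (W, b) integer weight matrices and bias vectors."""
    h = np.asarray(x)
    for W, b in layers:
        z = W @ h + b                    # linear part (homomorphic in FHE)
        z = np.round(z / 4).astype(int)  # round-and-rescale step
        h = lut(z)                       # activation via lookup table
    return h

# Tiny random network: 3 inputs -> 4 hidden neurons -> 2 outputs.
rng = np.random.default_rng(0)
layers = [(rng.integers(-1, 2, size=(4, 3)), rng.integers(-1, 2, size=4)),
          (rng.integers(-1, 2, size=(2, 4)), rng.integers(-1, 2, size=2))]
print(infer([1, 2, 3], layers))
```

The rescaling keeps intermediate values inside the small LUT range, which is also why the real protocol switches to a smaller modulus before each table evaluation.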
This protocol efficiently implements privacy-preserving neural network inference by leveraging fully homomorphic encryption (FHE). FHE allows computation on encrypted data without leaking the data itself to the processing server, ensuring data privacy while providing the benefits of AI as a service.
Applications of Fully Homomorphic Encryption in AI
FHE (Fully Homomorphic Encryption) makes secure computation on encrypted data possible, opening up many new application scenarios while ensuring data privacy and security.
Consumer Privacy in Advertising: (Armknecht et al., 2013) proposed an innovative recommendation system that leverages fully homomorphic encryption (FHE). This system is able to provide personalized recommendations to users while ensuring that the recommendations are completely confidential to the system itself. This ensures the privacy of user preference information, effectively solving a major privacy issue in targeted advertising.
Medical Applications: (Naehrig et al., 2011) proposed a compelling solution for the healthcare industry. They proposed using fully homomorphic encryption (FHE) to continuously upload patients' medical data to service providers in encrypted form. This approach ensures that sensitive medical information remains confidential throughout its life cycle, which not only enhances patient privacy protection, but also enables healthcare institutions to seamlessly process and analyze data.
Data Mining: Mining large data sets can produce significant insights, but often at the expense of user privacy. (Yang, Zhong, and Wright, 2006) solved this problem by applying functional encryption in the context of fully homomorphic encryption (FHE). This approach makes it possible to extract valuable information from huge data sets without compromising the privacy of the individuals whose data is being mined.
Financial Privacy: Imagine a scenario where a company has sensitive data and proprietary algorithms that must be kept confidential. (Naehrig et al., 2011) proposed using homomorphic encryption to solve this problem. By applying fully homomorphic encryption (FHE), companies can perform necessary computations on encrypted data without exposing the data or algorithms, thereby ensuring financial privacy and protection of intellectual property.
Forensic Image Recognition: (Bosch et al., 2014) describes a method for outsourcing forensic image recognition using fully homomorphic encryption (FHE). This technology is particularly beneficial to law enforcement agencies. By applying FHE, police and other agencies can detect illegal images on hard drives without exposing the content of the images, thereby protecting the integrity and confidentiality of data in investigations.
Fully homomorphic encryption has the potential to revolutionize the way we handle sensitive information in a variety of fields, from advertising and healthcare to data mining, financial security, and law enforcement. As we continue to develop and improve these technologies, the importance of protecting privacy and security in an increasingly data-driven world cannot be overstated.
Limitations of Fully Homomorphic Encryption (FHE)
Despite its potential, FHE still has some key limitations that need to be addressed:
Multi-user support: Fully homomorphic encryption (FHE) allows computations to be performed on encrypted data, but the complexity increases exponentially in scenarios involving multiple users. Typically, each user's data is encrypted with a unique public key. Managing these different datasets, especially at scale given the computational demands of FHE, becomes impractical. To this end, researchers such as Lopez-Alt et al. proposed the multi-key FHE framework in 2013 to allow simultaneous operations on datasets encrypted with different keys. While promising, this approach introduces an additional layer of complexity and requires careful coordination in key management and system architecture to ensure privacy and efficiency.
Large-scale computational overhead: At the heart of fully homomorphic encryption (FHE) is its ability to perform computations on encrypted data. However, this capability comes at a significant price: FHE operations carry far higher computational overhead than traditional unencrypted computations. The overhead is polynomial in nature but involves high-degree polynomials, which inflates runtimes and makes FHE unsuitable for real-time applications. Hardware acceleration for FHE therefore represents a large market opportunity to reduce computational cost and increase execution speed.
Limited Operations: Recent advances have indeed broadened the scope of fully homomorphic encryption to support a wider variety of operations. However, it is still primarily applicable to linear and polynomial computations, which is a significant limitation for AI applications involving complex nonlinear models such as deep neural networks. The operations required by these AI models are challenging to efficiently execute under current fully homomorphic encryption frameworks. Although we are making progress, the gap between the operational capabilities of fully homomorphic encryption and the requirements of advanced AI algorithms remains a key barrier that needs to be overcome.
Fully Homomorphic Encryption in the Context of Cryptography and AI
Here are some of the companies working on leveraging fully homomorphic encryption (FHE) for AI applications in the crypto space:
Zama offers Concrete ML, a set of open source tools designed to simplify the process of using fully homomorphic encryption (FHE) for data scientists. Concrete ML is able to transform machine learning models into their homomorphic equivalents, enabling confidential computation on encrypted data. Zama's approach enables data scientists to leverage FHE without deep cryptography knowledge, which is particularly useful in fields such as healthcare and finance where data privacy is critical. Zama's tools facilitate secure data analysis and machine learning while keeping sensitive information encrypted.
Privasee is focused on building a secure AI computing network. Their platform leverages fully homomorphic encryption (FHE) technology to enable multiple parties to collaborate without revealing sensitive information. By using FHE, Privasee ensures that user data remains encrypted throughout the AI computation process, thereby protecting privacy and complying with strict data protection regulations such as GDPR. Their system supports a variety of AI models, providing a versatile solution for secure data processing.
Octra combines cryptocurrency with AI to improve digital transaction security and data management efficiency. By combining fully homomorphic encryption (FHE) and machine learning technology, Octra is committed to enhancing the security and privacy protection of decentralized cloud storage. Its platform ensures that user data is always encrypted and secure by applying blockchain, cryptography and AI technologies. This strategy builds a solid framework for digital transaction security and data privacy in the decentralized economy.
Mind Network combines fully homomorphic encryption (FHE) with AI to achieve secure encrypted computing during AI processing without decryption. This promotes a privacy-preserving, decentralized AI environment that seamlessly integrates cryptographic security and AI capabilities. This approach not only protects the confidentiality of data, but also enables a trustless, decentralized environment where AI operations can be performed without relying on a central authority or exposing sensitive information, effectively combining the cryptographic strength of FHE with the operational needs of AI systems.
The number of companies operating at the forefront of fully homomorphic encryption (FHE), artificial intelligence (AI), and cryptocurrency remains limited. This is primarily due to the significant computational overhead required to effectively implement FHE, which requires powerful processing power to efficiently perform cryptographic calculations.
Conclusion
Fully homomorphic encryption (FHE) offers a promising approach to enhancing privacy in AI by allowing computations to be performed on encrypted data without decryption. This capability is particularly valuable in sensitive fields such as healthcare and finance, where data privacy is critical. However, FHE faces significant challenges, including high computational overhead and limitations in processing nonlinear operations necessary for deep learning. Despite these obstacles, advances in FHE algorithms and hardware acceleration are paving the way for more practical applications in AI. Continued developments in this area are expected to greatly enhance secure, privacy-preserving AI services that balance computational efficiency with strong data protection.