Author: 0xTodd Source: X, @0x_Todd
The market has not been great lately, so I finally have some time to write about another new technical direction. Although the crypto market in 2024 is not as turbulent as it used to be, some new technologies are still trying to mature, including today's topic: FHE, Fully Homomorphic Encryption.
Vitalik also published an article dedicated to FHE this May, which I recommend to anyone interested.
So what exactly is FHE?
To understand the mouthful that is FHE, fully homomorphic encryption, you first need to understand what "encryption" is, what "homomorphic" means, and why it needs to be "fully".
1. What is encryption?
Ordinary encryption is the kind everyone knows best. Say Alice wants to send Bob a message, such as "1314 520".
Now suppose they need a third party, C, to deliver the message while keeping its content secret from C. That's simple: encrypt it by multiplying each number by 2, turning it into "2628 1040".
When Bob receives it, he divides each number by 2 and recovers "1314 520".
See? The two of them have transmitted the message with symmetric encryption: they used C as a courier, yet C never learned what the message said. The communication between two liaisons in a spy movie rarely goes beyond this.
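For intuition, here is a minimal Python sketch of Alice and Bob's toy scheme. The function names are just for illustration, and multiplying by a constant is of course not a real cipher:

```python
# Toy symmetric "cipher" from the Alice/Bob example: the shared secret is
# the operation "multiply by 2" and its inverse "divide by 2".
# For intuition only; multiplying by a constant is NOT real encryption.

KEY = 2  # the shared secret Alice and Bob agreed on

def encrypt(message: list[int], key: int = KEY) -> list[int]:
    """Alice encrypts each number before handing the note to C."""
    return [m * key for m in message]

def decrypt(ciphertext: list[int], key: int = KEY) -> list[int]:
    """Bob reverses the transformation after receiving the note from C."""
    return [c // key for c in ciphertext]

plaintext = [1314, 520]
ciphertext = encrypt(plaintext)      # what C sees: [2628, 1040]
print(ciphertext)
print(decrypt(ciphertext))           # Bob recovers: [1314, 520]
```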
2. What is homomorphic encryption?
Now Alice's requirements have been upgraded:
For example, Alice is only 7 years old;
Alice can only do the simplest arithmetic, ×2 and ÷2, and understands no other operations.
Now suppose Alice has to pay her electricity bill. Her bill is 400 yuan per month, and she is 12 months behind.
But 400 × 12 = ? is beyond what 7-year-old Alice can calculate; it's too complicated for her.
Yet she doesn't want anyone to know her monthly bill or how many months she owes, because that is sensitive information.
So Alice asks C to do the calculation for her, without trusting C.
Since she only knows ×2 and ÷2, she uses multiplication by 2 to encrypt her numbers, and asks C to compute 800 × 24, that is, (400×2) multiplied by (12×2).
C is an adult with strong mental arithmetic, quickly works out 800 × 24 = 19200, and tells Alice the number. Alice then computes 19200 ÷ 2 ÷ 2 and finds she owes 4800 yuan for electricity.
See? This is the simplest multiplicative homomorphic encryption. 800 × 24 is just a mapping of 400 × 12; the form of the computation is preserved through the transformation, which is why it is called "homomorphic".
What this encryption achieves is: you can delegate a computation to an untrusted party and still get the right answer, without your sensitive numbers ever being exposed.
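Here is a small sketch of Alice's electricity-bill trick, again with made-up names. It shows why masking with a secret factor is "homomorphic" with respect to multiplication: (a·k) × (b·k) = (a·b)·k², so dividing by k twice recovers the true product:

```python
# Alice's multiplicative trick from the electricity-bill example.
# Masking with a secret factor k preserves multiplication:
#   (a*k) * (b*k) = (a*b) * k*k, so dividing by k twice recovers a*b.
# Again a toy, not a secure scheme.

k = 2  # Alice's secret factor (the only arithmetic she knows: x2 and /2)

def alice_encrypt(x: int) -> int:
    return x * k

def c_compute(a_enc: int, b_enc: int) -> int:
    """C, the untrusted helper, multiplies the masked numbers he is given."""
    return a_enc * b_enc            # 800 * 24 = 19200; C never sees 400 or 12

def alice_decrypt(result_enc: int) -> int:
    return result_enc // (k * k)    # undo the two hidden factors of k

monthly_bill, months = 400, 12
result = alice_decrypt(c_compute(alice_encrypt(monthly_bill), alice_encrypt(months)))
print(result)  # 4800 -- the bill Alice owes, computed without revealing 400 or 12
```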
3. Why does "homomorphic encryption" need to be "full"?
However, that only works in an ideal world. The real world is not so simple: not everyone is 7 years old, and not everyone is as honest as C.
Let's assume a worse situation: C might try to work backwards. By brute force, C could figure out that Alice actually wanted to compute 400 × 12.
This is where "fully homomorphic encryption" comes in.
The factor of 2 that Alice multiplies into every number can be thought of as noise. If there is too little noise, C can crack it easily.
So Alice can introduce addition on top of the multiplication.
Ideally the noise should be as chaotic as a main-road intersection at 9 a.m.; then cracking it would be, for C, harder than climbing to the sky.
So Alice might multiply 4 times and add 8 times, which makes it far less likely that C can crack her numbers.
However, Alice's scheme is still only "partially" homomorphic encryption, that is:
(1) What she encrypts can only be used for a specific class of problems;
(2) She can only use a specific set of operations, because the number of additions and multiplications cannot be too large (generally no more than 15 or so).
"Fully" means that Alice should be able to apply any number of additions and multiplications in a polynomial over her encrypted data, so that a third party can be entrusted with the computation and the correct result can still be recovered after decryption.
A sufficiently long polynomial can express most of the mathematical problems in the world, not just a 7-year-old's electricity bill.
With any number of encrypted additions and multiplications allowed, it becomes practically impossible for C to spy on the private data, and you truly get to have it both ways.
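To make the noise idea and the cap on operations concrete, here is a minimal sketch in the spirit of the DGHV integer scheme (one of the classic "somewhat homomorphic" constructions), encrypting single bits with toy, insecure parameters. Additions and multiplications on ciphertexts give XOR and AND of the underlying bits, but every operation grows the hidden noise, and once the noise outgrows the secret key, decryption stops being reliable; removing that limit (via Gentry's bootstrapping) is exactly what makes a scheme "fully" homomorphic:

```python
import random

# A minimal sketch of a DGHV-style "somewhat homomorphic" scheme over the
# integers, encrypting single bits. Toy parameters; NOT secure.

P = random.randrange(10**8, 10**9) | 1    # secret key: a large odd integer
NOISE = 30                                # bound on the fresh noise r

def encrypt(bit: int) -> int:
    q = random.randrange(10**20, 10**21)  # random multiple of the secret key
    r = random.randrange(1, NOISE)        # small random noise, the masking term
    return q * P + 2 * r + bit

def decrypt(c: int) -> int:
    # Correct as long as the accumulated noise (the 2r + bit part) stays below P.
    return (c % P) % 2

a, b = encrypt(1), encrypt(0)
print(decrypt(a + b))   # ciphertext addition       -> 1 XOR 0 = 1
print(decrypt(a * b))   # ciphertext multiplication -> 1 AND 0 = 0

# Why this is only "somewhat" homomorphic: every multiplication multiplies the
# hidden noise terms together. Once the noise bound passes P, decryption is no
# longer guaranteed to be correct -- which is why the number of operations is
# capped, and why Gentry's bootstrapping (refreshing noisy ciphertexts) is
# needed to get a *fully* homomorphic scheme.
noise_bound = 2 * NOISE
for depth in range(1, 8):
    noise_bound *= 2 * NOISE
    print(depth, noise_bound, noise_bound < P)   # False once the budget is spent
```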
That is why fully homomorphic encryption has long been regarded as the holy grail of cryptography.
In fact, until 2009, homomorphic encryption only supported "partial homomorphic encryption".
In 2009, the new approach proposed by Gentry and others opened the door to fully homomorphic encryption; interested readers can look up that paper.
Many friends are still unsure what this technology is actually for. In which scenarios would fully homomorphic encryption (FHE) be needed?
For example, AI.
Everyone knows that a powerful AI needs enough data to feed it, but a lot of that data is far too privacy-sensitive to hand over. Can FHE let us have it both ways?
The answer is yes.
You can:
(1) Encrypt your sensitive data with FHE;
(2) Hand the encrypted data to the AI to compute on;
(3) The AI then spits out a pile of output that no one can read.
The AI can do this blindly because the data it works on is essentially vectors. AI, especially generative AI like GPT, does not "understand" the words we type at all; it simply "predicts" the most appropriate answer from those vectors.
However, since that unreadable output still obeys certain mathematical rules, and you are the one who encrypted the inputs:
(4) You can go offline and decrypt the output locally, just like Alice did;
(5) You have then achieved the goal: the AI lent you its enormous computing power to finish the calculation without ever touching your sensitive data (see the sketch below).
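As a rough illustration of steps (1) through (5), here is a toy client/server flow. It reuses Alice's multiply-by-a-secret trick rather than real FHE, so it only supports linear computations such as a weighted sum, and every name in it is hypothetical; the point is the shape of the workflow, in which the "AI" side only ever sees masked numbers and the user decrypts the result locally:

```python
# Toy illustration of the encrypt -> compute -> decrypt-locally workflow,
# using Alice's multiplicative masking instead of a real FHE scheme
# (so it only supports linear computations like a weighted sum).

SECRET = 7919  # the user's secret masking factor; never leaves the client

def client_encrypt(features: list[int]) -> list[int]:
    """Step (1): mask the sensitive features before sending them out."""
    return [x * SECRET for x in features]

def server_model(enc_features: list[int], weights: list[int]) -> int:
    """Steps (2)-(3): the 'AI' computes a weighted sum over masked data.
    Because the mask distributes over a linear combination, the server
    returns SECRET * (true score) without ever seeing the raw inputs."""
    return sum(w * x for w, x in zip(weights, enc_features))

def client_decrypt(enc_score: int) -> float:
    """Step (4): the user removes the mask locally, offline."""
    return enc_score / SECRET

sensitive = [61, 12, 94]       # e.g. quantized private features
weights = [2, -1, 1]           # the model's (public) parameters

masked = client_encrypt(sensitive)
score = client_decrypt(server_model(masked, weights))
print(score)                                        # 204.0
print(sum(w * x for w, x in zip(weights, sensitive)))  # same result on raw data
```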
Today's AI cannot do this; you have to give up privacy to use it. Think about it: everything you type into GPT goes in as plaintext! To change that, FHE is indispensable.
This is the root of the natural fit between AI and FHE. A thousand words boil down to one phrase: have it both ways.
Because FHE ties into AI and straddles both the crypto and AI fields, it naturally gets extra attention. There are quite a few FHE projects, such as Zama, Privasea, Mind Network, Fhenix and Sunscreen, and their application directions are also quite creative.
Today, let's look at one of these projects, @Privasea_ai.
This is an FHE project whose investment round was led by Binance. Its white paper describes a very fitting scenario: face recognition.
It wants both: the machine's computing power can determine whether the face belongs to a real, live person;
and: the machine never handles any sensitive facial information.
Introducing FHE solves exactly this problem.
However, doing FHE computation in the real world demands a huge amount of computing power. After all, Alice needs to perform "arbitrary" numbers of additions and multiplications on encrypted data, and computation, encryption and decryption are all compute-heavy processes.
Privasea therefore needs to build a powerful computing network and its supporting facilities, and it has proposed a PoW + PoS architecture for that network.
Recently, Privasea announced its own PoW hardware, the WorkHeart USB, which can be understood as part of the supporting infrastructure of Privasea's computing network; or, more simply, as a mining machine.
The initial price is 0.2 ETH, and the devices can mine 6.66% of the network's total token issuance.
There is also a PoS-like asset called the StarFuel NFT, which can be understood as a "work permit", with a total supply of 5,000.
Its initial price is also 0.2 ETH, and it entitles holders to 0.75% of the network's total tokens (via airdrop).
This NFT is interesting: it is PoS-like but not real PoS, an attempt to sidestep the question of "is PoS a security in the United States?"
The NFT lets users stake Privasea tokens, but it does not directly generate PoS income; instead it doubles the mining efficiency of the USB device bound to it, so it is PoS in disguise.
PS: I invested in this project earlier, so I have a discounted early-bird mint invitation code, siA7PO. Take it if you're interested.
nft.privasea.ai/
Back to the topic: if AI can really bring FHE into large-scale use, that will be a boon for AI itself. Bear in mind that many countries now center their AI regulation on data security and data privacy.
Even, to give a perhaps inappropriate example, in the Russia-Ukraine war some on the Russian side wanted to use AI, but given the American background of most major AI companies, doing so would likely leak intelligence like a sieve.
Yet not using AI at all means falling far behind. Even if the gap is not large today, give it another 10 years and we probably cannot imagine a world without AI.
So data privacy issues are everywhere in our lives, from wars between countries to unlocking a phone with your face.
In the AI era, if FHE technology can truly mature, it will undoubtedly be humanity's last line of defense.