1 Introduction
Digital Signature (DS), which developed from public key cryptography, is a cryptographic technology with a signing function that guarantees the integrity, authenticity, and non-repudiation of digital information, and it plays an important role in the field of information security. With the development of the digital era, threshold signatures have gained prominence because they effectively protect the security of signing keys, which has attracted many researchers to study the thresholding of digital signatures.
In the area of information security, threshold signatures serve as an underlying technology that protects keys in many applications by allowing multiple participants to collaborate on a signature. Even if some participants' key shares are stolen or tampered with, the signature can still be completed as long as the specified threshold is reached. However, threshold signature schemes still have limitations in performance or security, and several published schemes have later been found to contain security flaws.
Since Desmedt and Frankel first proposed the threshold signature scheme [1, 2], threshold signatures have attracted extensive attention and research in the cryptographic community. Many government agencies are also following the development of threshold signatures; for example, the National Institute of Standards and Technology (NIST) called attention to multi-party collaborative schemes in 2023 [3]. In industry, threshold signatures have become a core technology for blockchain platforms and fintech companies, safeguarding the execution security of multi-party custodial wallets and smart contracts.
Over the past five years, threshold signature schemes have seen an increased presence at top cryptography conferences, and many studies have surveyed and summarized them. Ergezer et al. [4] conducted a comparative analysis of threshold signature schemes based on ECDSA, Schnorr, and BLS signatures, highlighting their differences and applications. Aumasson et al. [5] provided a detailed survey of threshold ECDSA schemes, dissecting their components and benchmarking their performance and properties. Sedghighadikolaei et al. [6] comprehensively reviewed various threshold signature schemes, including post-quantum threshold signatures and exotic signature variants, along with their applications. Jing et al. [7] aligned their survey with the foundational signatures in the NIST standards, cataloguing common threshold signature types while exploring threshold signcryption techniques, applications, and standardization progress. However, critical gaps persist in these works: 1) incomplete coverage of signature types (e.g., omitting ECDSA-based threshold designs); 2) vague evaluation metrics for scheme comparison; 3) lack of clear developmental roadmaps for emerging scenarios.
To address these limitations, we investigate the evolution of threshold signatures and review innovative and representative studies from the past decade. Following the framework of basic digital signature types outlined in the International Organization for Standardization (ISO) reports [8], we select a variety of mainstream signatures based on different algorithmic structures and recent literature, which can be broadly classified into Schnorr, EdDSA, ECDSA, pairing-based, and SM2 signatures. We systematically organize their theoretical advancements and engineering challenges, analyze differences in thresholding designs, and evaluate their applicability across multiple dimensions.
This survey aims to introduce the mainstream threshold digital signature schemes and to compare them in various aspects, including security, functionality, technology, and performance. Our contributions are as follows:
● We examine the similarities and differences between threshold Schnorr and EdDSA signatures, investigating the scalability and determinism of these schemes.
● We provide a comprehensive description of threshold ECDSA and SM2 signatures, presenting the relationships and improvements among the schemes, and then categorize them by technology and functionality.
● We analyze and investigate a variety of pairing-based threshold signatures, including BLS, SM9, and the identity-based signatures in IEEE P1363.
● We compile current applications of threshold signatures and the difficulties encountered in thresholding digital signatures, and propose thoughts on future directions.
The rest of this paper is organized as follows. Section 2 illustrates the notations and relevant background of threshold signatures. In Section 3, we review some of the main threshold Schnorr digital signature schemes and their difficulties. Section 4 and Section 5 discuss the construction of the Schnorr signature variants, namely the thresholding of EdDSA and ECDSA. In Section 6, we examine threshold pairing-based digital signatures. We present the SM2-based threshold signature schemes in Section 7. Finally, we investigate the applications in Section 8 and future directions in Section 9.
2 Preliminaries
In this section, we introduce the fundamentals and related techniques of threshold signature schemes. First, we present the notations used in this paper; then we introduce the definitions, assumptions, and security notions of threshold digital signatures. We also show some cryptographic primitives used in designing threshold schemes. Finally, we give the evaluation criteria for different schemes.
2.1 Notations
Let λ be the security parameter and n be the number of parties. Thus t represents the threshold of the signature and t_c denotes the number of maximally corrupted participants. We denote P_i as the i-th participant in the scheme; sk and pk represent the private and public key, and m represents the message. We denote H as a secure cryptographic hash function, and we use H_1, H_2, H_3, and H_4 to represent the hash functions in the Schnorr, EdDSA, ECDSA, and BLS signatures, respectively. A represents the adversary in the schemes. Further notation will be introduced for the individual signatures where needed.
2.2 Digital signature
Digital signature is a cryptographic technology with a signing function. In DS, a signer holds a key pair (sk, pk) and uses the private key sk to sign a message m. A verifier can then check the integrity of the signature with the public key pk to determine whether the signature is valid. The definition of DS is as follows:
Definition 1 (Digital Signature). A digital signature scheme (DS) involves four algorithms (Setup, KeyGen, Sign, Verify), which are defined as follows:
● Setup(1^λ): On input the security parameter λ, this algorithm outputs the system domain parameters pp.
● KeyGen(pp): This is a randomized algorithm that initializes the parameters in the scheme and generates a key pair (sk, pk) on input pp.
● Sign(m, sk): On input the message m and the private key sk, this randomized algorithm generates and outputs the signature σ.
● Verify(m, σ, pk): This is a deterministic algorithm that outputs a bit b on input the pair (m, σ) and pk, where the signature is valid iff b = 1.
2.3 Threshold digital signature
Definition 2 (Threshold Digital Signature). A (t, n) threshold digital signature scheme (TDS) accomplishes a valid signature iff the number of participating parties is at least t. It consists of four algorithms (Setup, KeyGen, Sign, Verify):
● Setup(1^λ): On input the security parameter λ, this algorithm outputs the system domain parameters pp.
● KeyGen(pp): Given the parameters pp, this randomized algorithm outputs the public key pk and a set of secret key shares {sk_1, ..., sk_n}.
● Sign(m, {sk_i}): This randomized algorithm generates and outputs the signature σ on input the message m and the key shares of at least t participants.
● Verify(m, σ, pk): This deterministic algorithm outputs a bit b with the pair (m, σ) and pk as input, where the signature is valid iff b = 1.
2.4 Difficult problems and security definitions
Definition 3 (Discrete Logarithm Problem). Given a cyclic group G of prime order q with generator g and a randomly chosen element h ∈ G, it is hard to find an integer x such that g^x = h.
Definition 4 (Elliptic Curve Discrete Logarithm Problem). Given an elliptic curve E and two points P, Q on E with Q in the subgroup generated by P, it is hard to find an integer k such that Q = kP.
Definition 5 (Elliptic Curve Decisional Diffie-Hellman Problem). Given an elliptic curve group with prime order q and generator P, randomly choose a, b, c ∈ Z_q and compute aP, bP, abP, and cP. The two tuples (P, aP, bP, abP) and (P, aP, bP, cP) are computationally indistinguishable.
Definition 6 (EUF-CMA). Given a digital signature scheme DSS = (Setup, KeyGen, Sign, Verify), its standard security model is Existential Unforgeability against Chosen-Message Attacks (EUF-CMA), described by the following game:
● Setup. The challenger runs Setup and KeyGen to obtain the key pair (sk, pk) with the system public parameters as input and sends pk to the adversary A.
● Query. The challenger responds multiple times to the signature queries on messages m_i sent by the adversary: it runs the signing algorithm Sign(m_i, sk) to generate the signature σ_i and sends it to the adversary A.
● Forgery. The adversary outputs a forged signature σ* on a message m*. A wins the game when σ* satisfies Verify(m*, σ*, pk) = 1 and m* was never queried. We denote this winning probability as the advantage of A.
A digital signature scheme DSS is secure if this advantage is negligible for any adversary making polynomially many signature queries in polynomial time.
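To make the game concrete, here is a minimal Python sketch of the experiment; the `dss` object and its `keygen`/`sign`/`verify` methods are hypothetical placeholders for any scheme satisfying Definition 1, not an API from the surveyed works.

```python
def euf_cma_experiment(dss, adversary, max_queries=1 << 20):
    """Run the EUF-CMA game for a signature scheme `dss` against `adversary`.

    `dss` is a hypothetical object exposing keygen() -> (sk, pk),
    sign(sk, m) -> sig and verify(pk, m, sig) -> bool; `adversary` receives pk
    plus a signing oracle and returns a candidate forgery (m_star, sig_star).
    """
    sk, pk = dss.keygen()
    queried = set()
    calls = 0

    def sign_oracle(m):
        nonlocal calls
        calls += 1
        assert calls <= max_queries   # queries are polynomially bounded
        queried.add(m)
        return dss.sign(sk, m)

    m_star, sig_star = adversary(pk, sign_oracle)
    # The adversary wins iff the forgery verifies and m_star was never queried.
    return dss.verify(pk, m_star, sig_star) and m_star not in queried
```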
Definition 7 (Robustness). A threshold signature scheme is robust if the participants who comply with the protocol can still generate a valid signature even when some of the participants are corrupted.
2.5 Secure multi-party computation primitive
The concept of secure multi-party computation (MPC) was born out of Yao's "Millionaires' problem" for two parties [9] in 1982. These technologies allow the participants to jointly compute a function over their private, distributed inputs and to obtain the output without leaking anything else. They are often integrated with technologies from various other domains [10-13], and many MPC primitives are used in threshold schemes to optimize communication and computation.
Threshold signatures are a concrete application of MPC. Leveraging underlying MPC techniques can significantly enhance the applicability, security, and robustness of signature schemes in distributed scenarios. This includes cryptographic primitives such as secret sharing, oblivious transfer (OT), and homomorphic encryption (HE). We will briefly introduce some of the important MPC technologies for threshold schemes.
2.5.1 Secret sharing
Secret sharing is a technique of dividing a secret into multiple sub-secrets and distributing them to different participants to protect its security and availability. Secret sharing was first proposed by Shamir [14] and Blakley [15] in 1979. In the design of threshold signature schemes, the most commonly used variant is Shamir secret sharing (SSS), which is based on Lagrange polynomial interpolation.
Definition 8 (Shamir secret sharing). Given a secret value s ∈ Z_q, a (t, n)-SSS scheme over Z_q splits s among n participants such that at least t participants can jointly reconstruct s. The dealer chooses a set of random coefficients a_1, ..., a_{t−1} ∈ Z_q and generates a polynomial of degree t − 1, f(x) = s + a_1 x + ... + a_{t−1} x^{t−1}; the value f(i) denotes the share of participant P_i, and the pair (i, f(i)) is sent to P_i for i = 1, ..., n. Thus any t participants, indexed by a set S, can jointly compute s = f(0) = Σ_{i∈S} λ_i f(i), where λ_i = Π_{j∈S, j≠i} j/(j − i) are the Lagrange coefficients.
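As a concrete illustration of Definition 8, the following toy Python sketch (over an arbitrary prime field; not production code) shows sharing and Lagrange reconstruction at x = 0:

```python
import random

Q = 2**127 - 1  # a Mersenne prime used as the field modulus for this toy example

def share(secret, t, n):
    """Split `secret` into n Shamir shares with threshold t (degree t-1 polynomial)."""
    coeffs = [secret] + [random.randrange(Q) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, j, Q) for j, c in enumerate(coeffs)) % Q
    return [(i, f(i)) for i in range(1, n + 1)]

def reconstruct(shares):
    """Recover f(0) from any t shares via Lagrange interpolation at x = 0."""
    secret = 0
    for i, y_i in shares:
        lam = 1
        for j, _ in shares:
            if j != i:
                lam = lam * j * pow(j - i, -1, Q) % Q  # lambda_i = prod j/(j - i)
        secret = (secret + y_i * lam) % Q
    return secret

if __name__ == "__main__":
    shares = share(123456789, t=3, n=5)
    assert reconstruct(shares[:3]) == 123456789   # any 3 shares suffice
    assert reconstruct(shares[2:5]) == 123456789
```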
The SSS scheme assumes that the dealer is trustworthy. However, if the dealer misbehaves, for example by distributing inconsistent random values as shares to different participants, the results can be disastrous. Verifiable secret sharing (VSS) was proposed in 1985 by Chor et al. [16] to solve this problem, and many VSS protocols have since been invented.
Feldman’s VSS [17] provides an efficient method to ensure the accuracy and security of secret distribution for environments that require synchronized communication. Pedersen’s VSS [18] is based on a commitment scheme for securing the verification of secrets and polynomial coefficients. It makes the verification of secret shares more secure and unbiased by exploiting homomorphic cryptographic properties. A distinctive feature of Pedersen’s VSS is that it resists interference from active attackers and provides higher security.
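As a brief illustration of how such verification works (a textbook sketch, not the exact notation of any scheme in this survey), Feldman's VSS publishes commitments to the coefficients of the sharing polynomial, and each participant checks its share against them:

```latex
% Dealer: f(x) = a_0 + a_1 x + \dots + a_{t-1}x^{t-1} with secret s = a_0,
% broadcasts commitments C_j = g^{a_j} for j = 0, \dots, t-1.
% Participant i, holding the share s_i = f(i), accepts iff
g^{s_i} \;=\; \prod_{j=0}^{t-1} C_j^{\,i^{j}} .
% Pedersen's VSS instead commits as C_j = g^{a_j} h^{b_j}, hiding the
% coefficients information-theoretically while keeping the same check shape.
```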
Publicly verifiable secret sharing (PVSS) [19] is a VSS scheme that allows anyone (not only the participants) to verify the correctness of a secret share. This public verification mechanism enhances the transparency and trust of the system and is particularly suitable for scenarios that require public auditing and verification.
Asynchronous verifiable secret sharing (AVSS) [20] is a verifiable secret sharing protocol implemented in an asynchronous communication environment. It does not rely on synchronous assumptions, allowing messages to arrive at any time and dealing with message delays and sequential uncertainty. AVSS enhances system robustness and resilience by providing security and accuracy guarantees in asynchronous environments suitable for distributed systems, blockchains, and other asynchronous network environments.
Proactive secret sharing (PSS) [21] is designed to address long-term security threats by periodically refreshing the secret shares held by participants without changing the original secret. By constantly refreshing the shares, this scheme resists gradual, long-term compromise, increasing the security and robustness of long-running distributed systems and critical infrastructures.
2.5.2 Homomorphic encryption
Homomorphic encryption (HE), one of the fundamental technologies in privacy-preserving computation, was first conceptualized by Rivest et al. in 1978 [22]. As a cryptographic technique, HE enables direct computation on encrypted data, allowing mathematical operations to be performed on ciphertexts such that the decrypted result matches the outcome of the equivalent operations executed on plaintext.
HE is classified into Partially Homomorphic Encryption (PHE) and Fully Homomorphic Encryption (FHE): PHE supports a single type of operation (e.g., addition or multiplication), whereas FHE permits arbitrary combinations of addition and multiplication. Since Craig Gentry's groundbreaking work in 2009 [23], which introduced the first FHE scheme based on ideal lattices, numerous FHE variants, such as BFV [24], GSW [25], and CKKS [26], have been proposed. Subsequent advancements have focused on algorithmic improvements and hardware acceleration to enhance the computational efficiency of homomorphic encryption systems.
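To illustrate the additive homomorphism that several of the threshold schemes discussed later rely on (e.g., the Paillier-based MtA protocols in Section 5), here is a toy textbook Paillier sketch in Python with deliberately tiny, insecure primes, for illustration only:

```python
import math, random

def keygen(p=17, q=19):
    """Toy Paillier key generation with tiny fixed primes (insecure, demo only)."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)          # Carmichael function of n
    g = n + 1                             # standard simplification g = n + 1
    mu = pow((pow(g, lam, n * n) - 1) // n, -1, n)
    return (n, g), (lam, mu)

def encrypt(pk, m):
    n, g = pk
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:            # r must be coprime to n
        r = random.randrange(1, n)
    return pow(g, m, n * n) * pow(r, n, n * n) % (n * n)

def decrypt(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    return (pow(c, lam, n * n) - 1) // n * mu % n

if __name__ == "__main__":
    pk, sk = keygen()
    n = pk[0]
    c1, c2 = encrypt(pk, 12), encrypt(pk, 34)
    # Multiplying ciphertexts adds plaintexts; exponentiating scales them.
    assert decrypt(pk, sk, c1 * c2 % (n * n)) == 12 + 34
    assert decrypt(pk, sk, pow(c1, 10, n * n)) == 12 * 10
```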
2.5.3 Oblivious transfer
Oblivious transfer (OT) is a cryptographic protocol designed to protect privacy in communication between two parties, ensuring selective privacy during message exchange. OT guarantees that the sender cannot determine which messages the receiver has selected and the receiver only obtains the specific information they chose and gains no knowledge about unselected messages.
The first OT protocol was proposed by Rabin in 1981 [27], but it delivered the message with only a 1/2 success probability. In 1985, Even et al. [28] introduced the 1-out-of-2 OT protocol, laying the foundation for modern OT frameworks. Subsequent research expanded OT protocols based on diverse cryptographic primitives and mathematical hardness assumptions, such as the development of 1-out-of-n OT [29], Random OT (ROT) [30], and OT extension [31] techniques. Today, OT serves as a foundational component of MPC, enabling its widespread adoption in cryptographic schemes and real-world applications, from privacy-preserving data analysis to secure federated learning.
2.6 Evaluation criteria
In this paper, we investigate two aspects of the threshold signature scheme, namely performance and security.
2.6.1 Performance
The performance criteria for threshold signatures fall into the following categories:
● Communication rounds. In the multi-party collaborative task of completing a threshold signature, each participant needs to compute and send intermediate values and receive the information sent by the other participants. The number of communication rounds refers to how many times information is exchanged between participants during the execution of the protocol, where each round typically involves all participants sending and receiving a specific type of information. Many recent algorithms support a pre-signing phase, i.e., participants can execute the steps that do not require the message m in an offline phase, thus reducing the number of online interaction rounds.
● Communication overhead. Communication overhead refers to the amount of data transmitted and resources consumed by participants to exchange information for signature generation during protocol execution. In addition to the intermediate values of the signature computation, the exchanged data typically includes commitments and zero-knowledge proofs.
● Computation complexity. This criterion covers the local computation overhead of each participant, which includes computing the intermediate values of the signature as well as the steps that implement the various functionalities and cryptographic primitives.
2.6.2 Security
Security criteria cover various aspects, including threshold optimality, security assumption, and adversary behaviour:
● Threshold optimality. This criterion means that the total number of participants n in a (t, n) threshold signature algorithm need only satisfy n ≥ t + 1. Earlier threshold signature algorithms [32] require n ≥ 2t + 1, i.e., if t + 1 participants can cooperate to produce a complete signature, the private key needs to be split among at least 2t + 1 nodes, which increases the operational cost and the risk of private key leakage.
● Security assumption. In security proofs, an attack on the scheme is reduced to solving some hard problem, i.e., the security of a cryptographic scheme is reduced to the hardness of its underlying assumptions. The algorithm is considered secure if the underlying problem is recognized as hard or unsolvable. Weaker security assumptions indicate that the algorithm has fewer security dependencies and therefore offers stronger guarantees. Threshold signature schemes usually involve assumptions such as the strong RSA assumption and standard ECDSA-related assumptions.
● Adversary behaviour. Adversary behaviour can be described in several ways:
1) A semi-honest adversary tries to obtain as much information as possible from the protocol execution but still follows the protocol. A malicious adversary corrupts parties and causes them to deviate from the protocol.
2) When more than half of the participants in the model are honest, it is called an honest majority. Conversely, when the adversary corrupts more than half of the participants, it is called a dishonest majority.
3) Based on when the adversary can corrupt parties, the models are categorized as static and adaptive. The model is called static when the adversary selects the parties to corrupt before the start of the protocol and keeps this choice unchanged during the protocol; the adaptive model means that the adversary can corrupt honest participants at any time.
3 Threshold Schnorr digital signature
Based on the threshold signatures mentioned in the ISO’s report [8], we first investigate the Schnorr signature [33] and its threshold schemes based on the Discrete Logarithm Problem (DLP). Variants of Schnorr signatures include EC-Schnorr, EdDSA, and ECDSA over the elliptic curve. We describe Schnorr signatures on integer groups and their associated definitions below.
Definition 9 (Schnorr Digital Signature Scheme). The Schnorr digital signature scheme [33] consists of four algorithms, defined as follows:
1. Setup(1^λ): On input the security parameter λ, this algorithm outputs the public parameters (p, q, g, H_1), where p and q are two primes with q | p − 1, g is a generator of the Schnorr group of prime order q in Z_p*, and H_1 is a hash function.
2. KeyGen: Choose a private key x ∈ Z_q and compute the public key y = g^x mod p.
3. Sign(m, x): The signer chooses a nonce k ∈ Z_q and computes R = g^k mod p, then computes the signature pair (e, s) to output for the message m, where e = H_1(R, m) and s = k + x·e mod q.
4. Verify(m, (e, s), y): The verifier computes R' = g^s · y^{−e} mod p from y and the signature (e, s), and checks whether e = H_1(R', m). If the equation holds, the verifier outputs 1.
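The following toy Python sketch instantiates Definition 9 over a deliberately tiny Schnorr group (p = 607, q = 101), so it is insecure and for illustration only:

```python
import hashlib, secrets

# Tiny Schnorr group: q | p - 1 and G generates the order-q subgroup of Z_p*.
P, Q = 607, 101
G = pow(2, (P - 1) // Q, P)

def H1(R, m):
    """Hash the commitment R together with the message into Z_Q."""
    return int.from_bytes(hashlib.sha256(str(R).encode() + m).digest(), "big") % Q

def keygen():
    x = secrets.randbelow(Q - 1) + 1          # private key x in Z_Q*
    return x, pow(G, x, P)                    # public key y = g^x mod p

def sign(m, x):
    k = secrets.randbelow(Q - 1) + 1          # fresh nonce per signature
    R = pow(G, k, P)
    e = H1(R, m)
    return e, (k + x * e) % Q                 # s = k + x*e mod q

def verify(m, sig, y):
    e, s = sig
    # R' = g^s * y^(-e) mod p; accept iff hashing R' reproduces e.
    R_prime = pow(G, s, P) * pow(y, -e, P) % P
    return H1(R_prime, m) == e

if __name__ == "__main__":
    x, y = keygen()
    assert verify(b"hello", sign(b"hello", x), y)
```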
The linear structure of the Schnorr signature makes it easy to aggregate signature shares. The security of Schnorr signatures in multi-party environments is nevertheless a challenge, for example whether the scheme can reliably output valid signatures. During the signature aggregation phase, it is important to prevent rogue-key attacks by adversaries and to avoid the forging of illegal signatures. In addition, how to increase the number of tolerable corrupted participants to improve security has become a focus of research.
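To illustrate this linearity with a generic sketch (not the exact equations of FROST or any other specific scheme, and omitting the binding factors and proofs that protect against rogue-key and ROS attacks): if each signer P_i holds a Shamir share x_i of the key x and a nonce share k_i with R = g^{Σ k_i}, the partial signatures combine additively:

```latex
\begin{align*}
  s_i &= k_i + \lambda_i x_i e \pmod{q}, \qquad
        e = H_1(R, m), \quad
        \lambda_i = \prod_{j \in S,\ j \neq i} \tfrac{j}{j - i}, \\
  s   &= \sum_{i \in S} s_i
       = \sum_{i \in S} k_i + e \sum_{i \in S} \lambda_i x_i
       = k + x e \pmod{q},
\end{align*}
```

so the pair (e, s) verifies as an ordinary Schnorr signature under y = g^x.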
3.1 Robust and non-robust threshold Schnorr signature
Early thresholding of Schnorr signatures did not attract much attention, as research at the time focused on the design of the Schnorr signature itself. In the following, we call a scheme robust if a valid signature can still be produced as long as enough participants correctly follow the protocol, and non-robust if the scheme merely detects deviations from the protocol and aborts.
3.1.1 Robust schemes
In 2001, Stinson et al. [34] directly utilized the Distributed Key Generation (DKG) protocol of [35] to perform key generation and distribution in a multi-participant environment. This scheme requires at least four rounds of communication by the participants, of which the DKG during the signing process accounts for three rounds. Similarly, Gennaro et al. [36] improved the DKG protocol of Pedersen [37] to only two rounds of operation in 2003; this differs from [34] in that the signing nonce is generated in the signature phase. They further proposed a method involving a trusted third party to perform signature aggregation, which was later adopted in subsequent work [38].
However, the above schemes have some limitations, such as not being able to achieve threshold optimality, i.e., not being able to work in the setting that only requires n ≥ t + 1. Moreover, these schemes are unable to distinguish between types of participant failure: the other signers will reconstruct the key shares even if a signer has a benign failure rather than a malicious one.
Joshi et al. [39] proposed ATSSIA, an asynchronous and non-interactive scheme motivated by problems with FROST, which reduces security to the DLP while supporting a dishonest majority. Their protocol performs the DKG twice per signature and is robust and non-interactive in the actual signing phase, but it still cannot avoid the high cost of the many non-robust and synchronous steps, involving multiple rounds, in the pre-signing phase.
Ruffing et al. [40] proposed a method to provide robustness in the signing phase. The paper constructs a wrapper that needs only one preprocessing round and one online signing round to complete the signature, making the protocol robust and asynchronous. This wrapper can be applied to FROST [38], which by itself is non-robust; it provides robustness by running concurrent signing sessions and offers the functionality of identifiable aborts. Finally, the paper points out that their wrapper can be combined with the scheme proposed by González et al. [41], which guarantees robustness in the key generation phase of [38], to achieve a fully robust threshold Schnorr signature scheme.
Benhamouda et al. [42] proposed a new threshold Schnorr scheme called SPRINT for scenarios with a large number of participants, which guarantees robustness on asynchronous networks with high throughput. The paper considers protocol requirements from a blockchain scenario and aims to generate many Schnorr signatures at once to amortize the protocol cost. Their signature phase also uses an optimized DKG method to generate message-independent nonces in two rounds, and then generates multiple signatures non-interactively. To ensure that protocol performance is not affected by the network latency of individual participants, they forgo complete secret sharing [43] and use an early negotiation protocol. To maximize signing efficiency, they proposed a component called extreme packing that combines packed secret sharing [44] with super-invertible matrices [45], and proposed sharing the long-term keys inside the packing vector so that each participant can perform multiplication and randomization locally. The cost, however, is a reduction in the number of maximally corrupted participants that can be tolerated, by an amount governed by an efficiency parameter. The security of this scheme relies only on the DL problem in the programmable Random Oracle Model (ROM), and Shoup [46] points out that the scheme can be combined with FROST [38] to improve security.
In 2024, Groth et al. [47] also proposed another asynchronous robust protocol. Unlike SPRINT, this protocol has optimal resilience for the asynchronous setting, tolerating up to t < n/3 corrupted parties. The authors proposed a GoAVSS protocol and a new super-invertible matrix algorithm to support their MPC engine. The GoAVSS protocol is responsible for the distribution and verification of secrets and is constructed based on the protocol of [48], using error-correcting codes to avoid the overhead of polynomial commitments. The super-invertible matrix efficiently combines ephemeral public keys into a smaller number of public keys. Their scheme therefore supports an efficient pre-signature generation process, which produces a large number of pre-signatures in the offline phase through batching techniques, reducing the computational and communication cost in the online signing phase. This design greatly improves the throughput of signature generation and supports the generation of a large number of signatures per second.
3.1.2 Non-robust schemes
In 2021, Komlo and Goldberg proposed a non-robust scheme called the FROST protocol [38]. The protocol suspends and restarts when any participant is detected to have deviated from it, and it can identify malicious participants. The KeyGen protocol in FROST is a variant of Pedersen’s DKG [37], which introduces zero-knowledge proofs of the secret values to prevent rogue-key attacks [49] when a majority of participants may be dishonest. The FROST signing protocol is a two-round interactive protocol. Since the first round of operations does not involve the message m to be signed, it can be executed in a pre-processing phase. The FROST protocol has a signature aggregator role, as mentioned in [36], which collects the commitments and intermediate values computed by all the participants and composes them into a list; the aggregator then sends the list and the message m to the selected participants for the subsequent computation and verification, which makes the scheme resistant to the attack in [50]. Each signing participant can take over this role in the implementation, so that the messages previously sent to the signature aggregator are instead sent to all the other participants.
3.2 Security assumption
After FROST was introduced, many researchers began to pay more attention to security notions and security assumptions.
To study the security of threshold Schnorr signatures more uniformly, in 2022, Bellare et al. [51, 52] proposed a hierarchy of security definitions called TS-UF-i for increasing levels i, and designed the FROST2 algorithm in [51]. In the static corruption model, the security of FROST and FROST2 can be reduced to the One-More Discrete Logarithm (OMDL) assumption [53] in the ROM. The scheme devises a variant of Pedersen’s DKG protocol [37] for the key generation phase by adding proofs of possession, called PedPoP. The Schnorr Knowledge of Exponent assumption (Schnorr-KOE) is introduced to guarantee the operation of PedPoP in the dishonest-majority case, and the proof shows that the security of this signature can be reduced to the DL assumption under the Algebraic Group Model (AGM). Finally, the paper proves that FROST is TS-UF-3 secure and FROST2 is TS-UF-2 secure.
Lindell proposed a new three-round interactive threshold Schnorr algorithm [54] in 2022, which similarly avoids re-running the DKG in each signing session and guarantees point-to-point encrypted transmission between participants under a PKI system. Unlike FROST, the security of this scheme relies only on the DL problem in the ROM and is resistant to ROS attacks (Random inhomogeneities in an Overdetermined Solvable system of linear equations). Furthermore, the paper states that the scheme is UC-secure when the zero-knowledge proof and commitment functionalities it uses are UC-secure. Two issues remain to be addressed in subsequent work: the use of zero-knowledge proofs results in poor performance, and the protocol is only secure in the static model. The paper also points out the tension between threshold signing and the deterministic nonces of EdDSA signatures.
In 2023, Crites et al. [55] proposed Sparkle+, a three-round interactive algorithm, which is the first scheme to achieve full adaptive security, at the cost of relying on strong interactive assumptions. This work continues Lindell’s approach [54]: it is statically secure under the DLP assumption in the ROM, but needs no online zero-knowledge proofs, which makes it more efficient. Moreover, the security of the algorithm can be lifted to the Algebraic One-More Discrete Logarithm (AOMDL) assumption [56] under the AGM and the random oracle model, but the number of corrupted parties must satisfy a stricter bound under the adaptive security condition.
In 2024, Bacho et al. [57] proposed HARTS, which offers a high corruption threshold, adaptive security, and robustness. The scheme ensures the generation of valid signatures in asynchronous networks and maintains security in the presence of a large number of malicious participants. Based on the DKG idea of [35], HARTS uses asynchronous verifiable secret sharing (AVSS), MVBA [58], and super-invertible matrices [47] to construct a packed asynchronous distributed key generation technique that ensures security and efficiency in the presence of malicious behaviour and network delays. The scheme balances efficiency and security with a low amortized communication cost per signature and a constant number of interaction rounds. It relies on the AGM and OMDL to ensure security under adaptive corruption.
Bacho et al. [59] also proposed Twinkle in the same year, a new scheme achieving full adaptive security that uses tagged linear function families and the Decisional Diffie-Hellman (DDH) assumption to provide security proofs without rewinding. It also relies on a five-move identification scheme to ensure the security and efficiency of the signing process. Twinkle reduces implementation and assumption complexity compared to Sparkle+ [55] by not relying on the AGM, supporting flexible corruption thresholds, and avoiding rewinding techniques. The paper points out a gap in Sparkle’s security proof, i.e., an adversary may send inconsistent sets of commitments to different honest parties, resulting in invalid signatures; for this reason, Twinkle introduces new techniques such as equivalence classes to solve this problem. Twinkle’s signature size is at most three times that of an ordinary Schnorr signature, and signing can be accomplished in only three rounds of interaction.
3.3 Deterministic
In 2021, Garillot et al. [60] proposed a stateless deterministic threshold Schnorr signature scheme. The traditional Schnorr signature scheme needs to generate a fresh random nonce every time a signature is made, which may cause serious security problems in practice, because the random number generator may suffer rollback attacks and is vulnerable to factors such as system crashes and malicious manipulation. The proposed scheme has each participant hold a secret PRF key and derive its nonce share through a deterministic algorithm, achieving deterministic nonce derivation and avoiding reliance on an unreliable random number generator. By introducing a UC-commitment-based mechanism, the paper constructs a commitment scheme based on Zero-Knowledge from Garbled Circuits (ZKGC) [61] with oblivious transfer, which allows a prover to commit to a secret value once and prove its correctness several times afterwards. The paper also proposes a gadget for converting intermediate garbled-circuit wire labels into arithmetic encodings, thus simplifying the zero-knowledge proofs. This scheme requires fewer rounds and improves performance compared to the scheme using a generic MPC protocol [62].
The first two-party Schnorr threshold scheme with statelessness and determinism was presented by Kondi et al. [63]. This work uses Pseudorandom Correlation Functions (PCFs) to generate unpredictable pseudo-random nonces. Two instantiations are constructed based on Vector Oblivious Linear Evaluation (VOLE): one is a protocol based on [64] that satisfies covert security [65], and the other is an actively (maliciously) secure protocol based on [66]. The first protocol achieves security based on PRF and OT assumptions, while the second is based on the Decisional Composite Residuosity (DCR) and strong RSA assumptions, but it is still not possible to achieve both covert and malicious security in a single instantiation. The stateless design eliminates state dependencies in the signing process and prevents the security risk of state reuse, while completing the signature in two rounds of interaction. The limitation of this scheme is that the constructed PCF only supports two-party collaboration; extending it to the general t-party case remains an open problem.
3.4 Other functionality
Boneh et al. [67] proposed a new scheme, TAPS, in 2022. The paper points out that existing private threshold signature (PTS) and accountable threshold signature (ATS) schemes do not balance the two properties, and thus TAPS is proposed to guarantee traceability of the signers when necessary while maintaining privacy in the ordinary case. The main feature of the scheme is to protect the privacy of the signers through public-key encryption and zero-knowledge proofs while allowing the signers to be traced when necessary. In the construction, they use ElGamal encryption to encrypt part of the information of the combined signature to ensure the privacy of the signer set, and a Sigma protocol or Bulletproofs [68, 69] to prove the validity of the encrypted signature without revealing the signer set. The scheme requires three rounds and one round of interaction in the signature generation and verification processes respectively, with relatively low communication and computation overheads; especially when Bulletproofs are used, the signatures and proofs are shorter, which makes the scheme suitable for large signer sets. The combiner collects the signature shares of the signers, generates the ATS signature, encrypts it, generates the corresponding zero-knowledge proof, and finally outputs the resulting signature. This scheme allows only the holder of the tracing key to recover the signers' identities when necessary, providing a secure and efficient solution for a variety of application scenarios.
Finally, we give Tab.1 and Tab.2 to summarize some of the schemes from recent years; they contain the network model of each scheme, the maximum number of corrupted participants, the number of signing rounds, the attack model, the robustness, and the security assumptions.
4 Threshold EdDSA digital signature
The Edwards-curve Digital Signature Algorithm (EdDSA) [70, 71] is a variant of the Schnorr signature over twisted Edwards curves, and its security is based on the Elliptic Curve Discrete Logarithm Problem (ECDLP). The relevant definitions of EdDSA are described as follows:
Definition 10 (EdDSA Digital Signature Scheme). Given a twisted Edwards curve E, the algorithms of the EdDSA signature are described as follows:
1. Setup(1^λ): On input the security parameter λ, this algorithm outputs the system domain parameters (E, q, B, H_2), where ⟨B⟩ is an additive cyclic group of order q with generator B, and H_2 is a hash function.
2. KeyGen: Select a secret key k and compute H_2(k) = (h_0, h_1, ..., h_{2b−1}). Then split the output into the left half (h_0, ..., h_{b−1}) and the right half (h_b, ..., h_{2b−1}), define the secret scalar s from the left half, and compute the public key A = sB.
3. Sign(m, k): Compute a nonce r = H_2(h_b, ..., h_{2b−1}, m), where (h_b, ..., h_{2b−1}) is the right half of the hash value. Then compute R = rB and S = (r + H_2(R, A, m)·s) mod q. Finally output the signature (R, S).
4. Verify(m, (R, S), A): The verifier first computes H_2(R, A, m) and checks whether SB = R + H_2(R, A, m)·A holds; if it does, the verifier outputs 1, otherwise 0.
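The correctness of the verification equation in Definition 10 follows directly from the linear structure (cofactor multiplications used by some EdDSA variants are omitted):

```latex
\begin{equation*}
  S B = \bigl(r + H_2(R, A, m)\, s\bigr) B
      = r B + H_2(R, A, m)\,(s B)
      = R + H_2(R, A, m)\, A .
\end{equation*}
```

Because r is derived deterministically from the secret prefix and the message, signing the same message twice yields the same signature, while reusing r for two different messages would reveal s; this is precisely the tension that the threshold EdDSA schemes below must handle.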
The choice of parameters in EdDSA has a significant impact on security, and the specific parameter selection can be found in [72]. EdDSA is a deterministic algorithm whose nonce is derived from a pseudorandom function (PRF); if the nonce derivation is mishandled, for example if the same nonce is reused for two different messages, the private key s can be disclosed.
In 2020, Feng et al. [73] proposed the first two-party EdDSA scheme, which generates the signature securely through an interactive protocol. The paper implements the protocol on mobile devices and gives the security proof in the ROM.
In 2021, Bonte et al. [62] proposed a threshold HashEdDSA signature scheme to cope with the high value and potential risks of generating valid signatures in application scenarios such as blockchain. Their contributions are a demonstration of efficient threshold HashEdDSA, without modifying the behaviour of the signature algorithm, through the use of a doubly-authenticated bit (daBit) generation protocol suitable for the supported access structures, and the proposed use of the MPC-friendly Rescue hash function to improve performance. On the technical side, the paper adopts and optimizes a daBit generation protocol for MPC that allows efficient conversion between binary and large prime fields. Security is based on the active adversary model and the discrete logarithm problem to ensure secret key security. The performance evaluation shows that using the Rescue hash function significantly reduces the computational overhead, especially for long messages. Compared with other schemes, this method does not require modifying standard algorithms and is suitable for existing systems.
Feng et al. [74] presented a new multi-party EdDSA signature protocol that is stateless and deterministic, works in a full-threshold setting, and can tolerate all but one malicious corruption. The stateless and deterministic generation of nonces is achieved through PCFs and mv-edaBits, and its correctness is verified with Multi-Verifier Zero-Knowledge proofs (MVZK) by generating and converting Information-Theoretic Message Authentication Codes (IT-MACs), generalized to the multi-verifier case. In the two-party case, the scheme remains secure based on the LPN assumption and takes only two rounds, as in [63]; however, the nonces in [63] are derived using customized functions instead of a PCF and thus cannot be adapted to EdDSA. In the multi-party case, the new scheme reduces the communication cost by a factor of 56, at the expense of an increase in computational cost compared to [60], and also completes in three rounds. The scheme significantly improves efficiency and especially stands out in reducing communication costs.
In 2024, Xie et al. [75] proposed an EdDSA-based accountable threshold signature protocol called TAPS-PR, which addresses the lack of a comprehensive realization of accountability, privacy, and key protection in blockchain systems in existing research. TAPS-PR utilizes zero-knowledge proofs to guarantee the correctness and privacy of key updates and introduces ElGamal encryption to hide the threshold value and the set of signers, thus improving privacy protection. In addition, the protocol introduces an entity called the Tracer that enables accountability by using a secret tracing key to identify the specific signers that generated a threshold signature in the event of fraud. For security, the Tracer verifies the legitimacy of the signature using zero-knowledge proof techniques and ensures that the tracing process does not reveal specific information about the signers. In addition, the paper designs a more efficient protocol, ATS-PR, which reduces the communication and computation overhead. Through experiments and theoretical analysis, the superiority of the proposed protocols in terms of communication and computation overhead is demonstrated, along with their compatibility and effectiveness in real applications.
Finally, we give Tab.3 to summarize the threshold EdDSA protocols with respect to the deterministic property.
5 Threshold ECDSA digital signature
The Elliptic Curve Discrete Logarithm Problem (ECDLP)-based threshold Elliptic Curve Digital Signature Algorithm (ECDSA) has been receiving a lot of attention lately, mainly because ECDSA is the signature scheme supported by the vast majority of cryptocurrencies. The details of the ECDSA definition are as follows:
Definition 11 (ECDSA Digital Signature Scheme). Given an elliptic curve E, a base point G of an additive cyclic group with prime order q, and a hash function H_3, the algorithms of the ECDSA signature are as follows:
1. Setup(1^λ): On input the security parameter λ, the algorithm outputs the system public parameters (E, G, q, H_3).
2. KeyGen: Choose a private key d ∈ Z_q*, compute the public key Q = dG.
3. Sign(m, d): For the message m, choose a nonce k ∈ Z_q*, compute R = kG = (r_x, r_y) and r = r_x mod q, then compute the other part of the signature s = k^{−1}(H_3(m) + r·d) mod q and output (r, s).
4. Verify(m, (r, s), Q): On input the message m, the signature (r, s), and the public key Q, the verifier checks the range of the signature values. Then the verifier computes (x', y') = H_3(m)·s^{−1}·G + r·s^{−1}·Q and outputs 1 iff r = x' mod q.
Much research aims to optimize the thresholding of ECDSA, with different design focuses, technical approaches, and cryptographic techniques. There is a certain degree of technical inheritance among the schemes, but they also retain their own technical independence. The main design difficulty of threshold ECDSA is the sharing of the nonce k and the private key d, and the computation of s = k^{−1}(H_3(m) + r·d) during joint signing. Note that the structure of ECDSA is nonlinear and cannot be directly aggregated linearly; the parties also need to collaborate to compute the inverse of a shared nonce in a multi-party environment. How to solve these difficulties has become the main direction of research.
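In the notation of Definition 11, the following equations show both why verification is correct and why the signing equation is nonlinear in the shares (a generic illustration, assuming additive shares d = Σ d_i and k = Σ k_i, which are not part of any specific scheme above):

```latex
\begin{align*}
  (x', y') &= H_3(m)\, s^{-1} G + r\, s^{-1} Q
            = s^{-1}\bigl(H_3(m) + r d\bigr) G = k G = R, \\
  s &= k^{-1}\bigl(H_3(m) + r d\bigr)
     = \Bigl(\sum_i k_i\Bigr)^{-1} H_3(m)
       + r \Bigl(\sum_i k_i\Bigr)^{-1}\Bigl(\sum_i d_i\Bigr) .
\end{align*}
```

The cross terms involving k^{−1} and k^{−1}·d cannot be obtained by simply adding locally computed values, which is why the schemes below resort to homomorphic encryption, OT-based multiplication, or MtA conversions.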
In 1996, Gennaro et al. proposed GJKR96 [32], the first ECDSA threshold signature scheme, which does not satisfy the property of threshold optimality. In [32], the private key d and the signing nonce k exist in the form of polynomial secret shares among the parties, with each P_i holding its own shares and computing its signature share directly during the signing process; multiplying shared values in this way raises the degree of the secret-sharing polynomial from t to 2t. This leads to a threshold t and a number of participants n that must satisfy n ≥ 2t + 1, which is not threshold-optimal.
Based on [32], a series of technical schemes have emerged around the design goals of threshold optimality, efficiency, and various functionalities. Taking the technique used to realize threshold optimality as the basic classification dimension, these schemes design MtA protocols using several techniques such as distributed homomorphic encryption (HE) and oblivious transfer (OT).
In 2015, Gennaro et al. applied threshold ECDSA signatures to Bitcoin wallets in [76], then proposed the concept of threshold optimality for the first time in [77] and designed the first threshold-optimal ECDSA signature algorithm. This scheme uses additive secret sharing for the key and the nonce, encrypted by an additively homomorphic encryption algorithm; the signing process performs multiplication and inversion in the ciphertext domain and finally produces a ciphertext of the signature, which all parties cooperate to decrypt to obtain the signature in plaintext. The key step is that all parties generate a set of parameters for the additively homomorphic encryption algorithm through a trusted third party or a publicly verifiable algorithm; the encryption key is public, and the decryption key is distributed as shares among the parties. However, the computational overhead is high due to the many zero-knowledge proofs and commitments used during the interaction.
The design idea of [78] is similar to that of [77]; its improvement is to replace the encryption algorithm from an additively homomorphic scheme to a level-1 homomorphic encryption algorithm [79], so that additions and a single level of multiplication on the encrypted parameters preserve the homomorphism, which reduces the number of interaction rounds from six to four.
Although the ECDSA threshold signature scheme based on distributed homomorphic encryption realizes threshold optimality, there is a fatal problem: its practicality is uncertain, because when the number of parties is more than two, how to generate the key of a homomorphic encryption algorithm (e.g., Paillier encryption) in a distributed manner is an open problem.
ECDSA threshold signatures based on the Multiplicative-to-Additive (MtA) protocol are the mainstream way to realize threshold ECDSA and have been widely used in practical business scenarios. The core of such schemes is the MtA protocol, which realizes the multiplication of secretly shared values (such as the terms involving k^{−1} and k^{−1}·d) without raising the degree of the secret-sharing polynomial, and therefore satisfies threshold optimality. Earlier MtA protocols are implemented with the Paillier homomorphic encryption algorithm: the inputs of the protocol are multiplicative secret shares a and b held by the two parties, and the outputs are additive secret shares α and β satisfying α + β = a·b. In subsequent works, many different MtA protocols have been proposed based on Oblivious Transfer (OT), Castagnos-Laguillaumie (CL) encryption [80], and other primitives.
Lindell [81] proposed a two-party ECDSA signature scheme based on Paillier encryption in 2017, which does not need to execute a distributed Paillier key generation algorithm but directly utilizes the homomorphic properties of Paillier to complete two-party collaborative signing, improving the operational efficiency of two-party collaborative signatures. The scheme includes a DKG protocol and a distributed signing protocol. The key generation phase is more complex than the signing phase because the party holding the Paillier key must prove that the key pair was generated correctly. Moreover, key generation is run only once, so this more costly key generation phase is acceptable.
Gennaro and Goldfeder [82] proposed GG18, the first ECDSA threshold signature scheme based on the MtAwc (MtA with check) protocol. They combined Feldman's VSS with the two-party multiplication protocol originally proposed by Gilboa [83], which enables the scheme to abort in the presence of a malicious adversary and drastically reduces the computational overhead of the signing process at the cost of additional rounds. In 2020, the same authors [84] improved GG18 and proposed GG20, which adds identifiable abort (IA) to the previous scheme and splits the signing process into a pre-processing phase and an online phase: the offline pre-processing performs the computations that are not related to the signed message locally, and only one round of interaction is needed to complete the signature online.
In 2018, Lindell et al. [85] also proposed a scheme, LNR18, which is similar to GG18. However, its focus is different: the starting point of LNR18 is to question the practicality of the key generation phase of [77] and [78], i.e., the difficulty of generating the key of a homomorphic encryption algorithm in a distributed manner. Based on this observation, LNR18 uses an exponential form of the ElGamal algorithm instead of the Paillier homomorphic encryption algorithm in the secret sharing components. Because ElGamal is linearly homomorphic, its distributed key generation is simple to implement; decryption does not directly yield the plaintext but rather the elliptic curve point corresponding to the plaintext, which suffices to verify the correctness of the computation process.
Castagnos et al. proposed CCL20 [86] in 2020. This scheme uses additive secret sharing for the key and the nonce, replacing the Paillier homomorphic encryption used in GG18 with the CL homomorphic encryption algorithm [80], thus avoiding the problems caused by the mismatch between the order of the Paillier modulus and the ECDSA modulus. Although the use of CL encryption removes the need to introduce a large number of zero-knowledge range proofs, the efficiency of this scheme is in the same order of magnitude as GG18, since the computational cost of CL is much larger than that of Paillier for the same message space.
In 2020, Canetti et al. proposed a new scheme with identifiable abort called CGGMP20 [87]. This scheme focuses on securing the computation process by introducing corresponding zero-knowledge proofs at each step, thus ensuring the security of the signing process and avoiding the risk of leaking sensitive information when a signature fails, which effectively strengthens the way GG18 verifies the correctness of the signature result. The scheme uses additive secret sharing for the key and the nonce, adds the identifiable abort functionality, and resists adaptive attacks. The paper also gives two variants, one with four (3+1) rounds of interaction and one with seven (6+1) rounds of interaction, trading the number of rounds against computational complexity.
On the other hand, DKLs18 [88] first proposed a two-party threshold ECDSA scheme based on Correlated Oblivious Transfer (COT), which is significantly different from the previous technical paths, although the COT is still used to implement the MtA functionality. Doerner et al. then proposed DKLs20 [89], which improves on DKLs18 and extends it to the multi-party setting while optimizing the communication and computation, achieving a roughly 40% performance improvement. The core component of the DKLs20 protocol is a two-party multiplication protocol, which is similar in function to MtA and is built on the COT implementation. On top of two-party multiplication, a multi-party inverse-sampling protocol and a multi-party multiplication protocol are implemented to perform the inversion and multiplication operations respectively, finally realizing the complete signing process.
In 2023, Doerner et al. proposed DKLs23 [90], a new three-round protocol based on OT, which allows the nonlinear relations to be computed in parallel, thus reducing the number of interaction rounds. The paper proposes a two-round Vector Oblivious Linear Evaluation (VOLE) for implementing the multiplications required by ECDSA signing, ensuring security and privacy during the computation and generating the required products from randomized inputs, which reduces the communication overhead of the concrete implementation. The protocol design employs statistical consistency checks rather than zero-knowledge proofs to verify the parties' inputs and outputs and ensure security. The security assumptions are based on ideal VOLE and commitment primitives and an OT implementation in the random oracle model. The concrete implementation requires each participant to perform only a few elliptic curve scalar multiplications in the signing phase, and the bandwidth cost is significantly reduced. With this approach, the new protocol ensures malicious security while reducing the number of interaction rounds and optimizing the communication and computation overhead, demonstrating a significant improvement among threshold ECDSA protocols.
In 2021, Kondi et al. [91] proposed an offline refresh protocol for threshold cryptocurrency wallets that allows any quorum of online parties in a threshold wallet to proactively refresh the system state at any time, while the remaining offline parties can catch up non-interactively at their convenience. The paper focuses on defining the concept of offline refreshing, developing an efficient protocol for the threshold signature setting, and constructing a proactivization mechanism to refresh the state of the threshold ECDSA protocol, requiring either Shamir secret sharing (SSS) or additive secret sharing. The security assumptions are based on the mobile adversary model, and it is shown that it is impossible to construct offline refresh protocols in the general case when there is a dishonest majority of online parties. Experimental results show that the scheme has good performance and security in small-scale decentralized applications, but it has some limitations in large-scale systems with higher thresholds.
Abram et al. [92] proposed a new low-bandwidth threshold ECDSA scheme. The scheme uses additive secret sharing for the key and the nonce and achieves silent preprocessing by utilizing a pseudorandom correlation generator (PCG) [93], allowing each participant to generate a large amount of correlated randomness locally from a short seed; only logarithmic communication complexity is then needed to generate ECDSA signatures, greatly reducing communication and storage requirements. It also employs the Learning Parity with Noise (LPN) and ring-LPN assumptions, Distributed Point Functions (DPF), and other techniques to ensure that the system operates efficiently with active security against an arbitrary number of malicious corrupted parties, and realizes non-interactive signing, which is well suited for large-scale financial applications.
Xue et al. [94] proposed an efficient two-party ECDSA signature scheme with additive secret sharing for the key and multiplicative secret sharing for the nonce, using a random mask to protect the secret share in the nonce phase, which removes one MtA conversion and significantly reduces the online computation and communication overhead. The online phase of the scheme is almost non-interactive, and its computational overhead is mainly the cost of a signature verification. Finally, the scheme is optimized and instantiated with MtA based on OT, CL, and Paillier.
In 2023, Xue et al. [95] proposed an efficient MtA functionality based on a modified Joye-Libert (JL) cryptosystem [96] and applied it to a threshold ECDSA signature scheme. The new scheme achieves efficient zero-knowledge proofs and encryption operations through the JL cryptosystem and a JL-based commitment scheme, and it proves the security of this MtA construction based on the strong RSA assumption and the k-QR (quadratic residuosity) assumption over the JL modulus. Compared with existing MtA schemes, it requires a smaller message space than Paillier-based MtA and incurs less commitment overhead than CL-based MtA. In batch mode, the communication overhead is reduced by a factor of 2.4 to 2.7 and the computational cost by a factor of 1.62 to 2.26. These optimizations give the new solution a significant advantage in application scenarios that require efficient and secure signatures, such as blockchain and distributed systems.
Wong et al. [97] presented a robust scheme that achieves threshold flexibility throughout the pre-signing and signing phases, with self-healing via a VSS-based Distributed Randomness Generation (DRG) protocol and an MtA protocol. The MtA protocol is constructed based on the GG18 scheme [82], where the range proofs can be reduced by replacing the Paillier homomorphic encryption with the CL encryption of Castagnos et al. [86]. The scheme requires four rounds of interaction in the pre-signing phase and one round in the online signing phase, with moderate computation and communication costs. Compared to CGGMP20 [87], this scheme introduces zero-knowledge proofs and homomorphic encryption while reducing the number of communication rounds and the cost, ensuring the correctness and security of the parties' contributions to the protocol. Even without an honest majority, the scheme can continue the protocol without restarting, thanks to its self-healing mechanism and cheater identification capability, saving time and computational resources and improving security, robustness, and efficiency.
Finally, we give Tab.4 and Tab.5 to summarize some of the schemes from recent years; they contain the attack model of each scheme, the number of signing rounds, the message space size, the computational overhead, the UC functionality, the IA functionality, and the security assumptions.
6 Threshold pairing-based digital signature
Pairing-based signature schemes utilize the properties of bilinear pairings to achieve efficient and secure signature verification for a wide range of advanced cryptographic applications. The definitions of bilinear pairings and the related signatures are as follows:
Definition 12 (Bilinear Pairing). Given an elliptic curve E, q-order cyclic groups G_1 and G_2 can be generated via the Weil pairing [98] or the Tate pairing [99-101]. The tuple (G_1, G_2, G_T, e) is called a bilinear group if it contains three cyclic groups of prime order q and a bilinear map e: G_1 × G_2 → G_T. For all a, b ∈ Z_q and the generators g_1 of G_1 and g_2 of G_2, the map satisfies the pairing property e(g_1^a, g_2^b) = e(g_1, g_2)^{ab} and the non-degeneracy property e(g_1, g_2) ≠ 1; then we can define the generator of G_T as g_T = e(g_1, g_2).
Thresholding pairing-based signatures faces the challenge of high resource consumption because these schemes rely on bilinear pairings. The design of threshold BLS signatures must address the verification of signature shares and also has to consider dynamic membership management in multi-user scenarios. The design of a threshold identity-based signature scheme, on the other hand, must consider the two challenges of how to distribute the master private key and how to distribute the signing operation.
6.1 Threshold BLS digital signature
Among pairing-based signatures, BLS signatures have been the subject of many thresholding studies. We introduce the definition below:
Definition 13 (BLS Digital Signature Scheme). Given a non-degenerate pairing $e: \mathbb{G}_1 \times \mathbb{G}_2 \rightarrow \mathbb{G}_T$ with generator $g_2$ of $\mathbb{G}_2$ and a hash function $H: \{0,1\}^* \rightarrow \mathbb{G}_1$, the algorithms of the BLS signature are as follows:
1. $\mathsf{Setup}(1^\lambda)$: On input the security parameter $\lambda$, the algorithm outputs the system public parameters $pp = (\mathbb{G}_1, \mathbb{G}_2, \mathbb{G}_T, q, e, g_2, H)$.
2. $\mathsf{KeyGen}(pp)$: Choose a private key $x \in \mathbb{Z}_q^*$ and compute the public key $pk = g_2^x$.
3. $\mathsf{Sign}(x, m)$: For the message $m$, compute $h = H(m)$ and the signature $\sigma = h^x$.
4. $\mathsf{Verify}(pk, m, \sigma)$: On input the signature $\sigma$ and public key $pk$, the verifier outputs $1$ iff $e(\sigma, g_2) = e(H(m), pk)$ holds.
Boldyreva [
102] was the first to propose a robust, actively secure threshold BLS signature scheme. The scheme is based on a Gap Diffie-Hellman (GDH) group [103], in which DDH is easy but CDH is hard. The private key is shared among the participants, and key generation is implemented with the DKG protocol of Gennaro et al. [35]. The scheme features robustness and proactive security: it generates valid signatures even in the presence of up to the threshold number of malicious participants and prevents cumulative compromise by periodically refreshing the shares. The signature generation process requires no interaction or zero-knowledge proofs, simplifying the protocol implementation.
Bacho et al. [
104] presented adaptive security considerations for threshold BLS signatures. The paper addresses the lack of effective security proofs against adaptive attacks in [102], which could lead to vulnerabilities in practical applications. To this end, it introduces a new DKG security notion, oracle-aided algebraic simulatability, which ensures that the secrecy of the key is maintained even if some participating nodes are corrupted during key generation. Specifically, the paper strengthens the protection against node corruption by refining the security definition of the DKG protocol and gives a modular security proof for several DKG protocols under the new definition, combining the AGM and OMDL assumptions. The scheme has low communication and computation overheads in signature generation and verification, enhancing the efficiency and security of practical applications.
In 2020, Tomescu et al. [
105] proposed a fast BLS threshold signature scheme based on the aggregability of BLS signatures. This paper uses polynomial secret sharing for private keys and reduces the aggregation time of BLS threshold signatures from quadratic to quasilinear in the threshold by using a fast Lagrange interpolation algorithm. It also introduces the Authenticated Multipoint Evaluation Tree (AMT) technique, which speeds up the computation of evaluation proofs for KZG [106] polynomial commitments and optimizes the VSS and DKG protocols, reducing the time of the sharing and reconstruction phases to quasilinear as well. This significantly improves the efficiency of the system for large-scale participants, despite an increase in communication and verification time for small-scale participants. The security of the paper is based on the strong bilinear Diffie-Hellman (SBDH) assumption and the polynomial Diffie-Hellman (polyDH) assumption.
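The core step of such aggregation is Lagrange interpolation "in the exponent": each signer $i$ issues a partial signature $H(m)^{x_i}$ on its Shamir share $x_i$, and any $t+1$ of them combine into $H(m)^x$ without ever reconstructing $x$. The sketch below illustrates only this arithmetic over a toy Schnorr-style subgroup of $\mathbb{Z}_p^*$ (real threshold BLS works in pairing-friendly groups and additionally verifies each share); the group parameters and threshold values are illustrative assumptions.

```python
import random

# Toy group: subgroup of Z_p^* of prime order q, generated by g (NOT a pairing group).
q = 1019                      # toy group order
p = 2 * q + 1                 # 2039, a safe prime so that q divides p - 1
g = pow(2, (p - 1) // q, p)   # generator of the order-q subgroup

def shamir_share(secret, t, n):
    """Degree-t polynomial sharing of `secret`; any t+1 shares reconstruct it."""
    coeffs = [secret] + [random.randrange(q) for _ in range(t)]
    return {i: sum(c * pow(i, j, q) for j, c in enumerate(coeffs)) % q
            for i in range(1, n + 1)}

def lagrange_at_zero(ids):
    """Lagrange coefficients lambda_i such that sum_i lambda_i * f(i) = f(0) mod q."""
    lams = {}
    for i in ids:
        num, den = 1, 1
        for j in ids:
            if j != i:
                num = num * (-j) % q
                den = den * (i - j) % q
        lams[i] = num * pow(den, -1, q) % q
    return lams

t, n = 2, 5
x = random.randrange(1, q)             # group secret key
shares = shamir_share(x, t, n)         # share x_i held by signer i

h = pow(g, random.randrange(1, q), p)  # stand-in for H(m), a group element
partials = {i: pow(h, shares[i], p) for i in [1, 3, 5]}   # sigma_i = h^{x_i}

lams = lagrange_at_zero(list(partials))
sigma = 1
for i, s in partials.items():
    sigma = sigma * pow(s, lams[i], p) % p                # prod_i sigma_i^{lambda_i}

assert sigma == pow(h, x, p)           # equals h^x, i.e. the full signature
```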
In 2023, Garg et al. [
107] proposed hinTS, a Silent Threshold Signature (STS) scheme in which the threshold value can be changed per signature without changing the public key, and which also supports the weighted case. The scheme extends BLS multi-signatures to threshold signatures by letting each participant locally compute and publish a public "hint" derived from its signing key during the setup phase; a SNARK [108] is used to prove that the set of signing participants reaches the threshold, and the public keys are aggregated with the method of [109] so that the aggregator's computation can be checked against the aggregated public key, thus avoiding a high-overhead DKG protocol. In addition, the scheme supports weighted thresholds and dynamic selection of signers and thresholds, and it is proven secure under the AGM and the Common Reference String (CRS) model. In terms of performance, the signing and verification times are 1 ms and 17.5 ms, respectively, and the aggregation time for 1,000 signers is less than 0.5 s. Although the EVM gas cost of verifying a signature on-chain is relatively high, the scheme can significantly reduce the overall cost and complexity in long-term use.
In 2023, Das et al. [
110] proposed a weighted threshold scheme. The paper addresses the limitations of existing schemes in weighted and multi-threshold environments, which cannot effectively handle signers with different weights and usually support only a single fixed threshold, limiting their application in decentralized systems such as PoS blockchains. The approach is similar to [107], extending weighted multi-signatures to threshold signatures. To prove the inner product between the weight vector and the indicator vector of the signing set, the scheme uses an inner-product argument (IPA), supporting arbitrary weight distributions and multiple thresholds while keeping the size of the signing and verification keys independent of the number of signers. The scheme uses efficient Lagrange polynomial computation and a non-interactive preprocessing method for efficient signature aggregation and verification. In terms of security, it is proved secure under the algebraic group model and the random oracle model. Although an initial preprocessing step is required, the computational overheads of the actual signing and verification processes are small: for 4096 signers, the signature size is 536 bytes, verification takes about 8.21 ms, and signature aggregation takes about 71 ms.
Finally, we give Tab.6 to summarize some of the threshold BLS signature schemes of recent years. The table contains the attack models of the schemes, the sizes of partial and aggregate signatures, the security assumptions, and the contributions of these schemes.
6.2 Threshold identity-based digital signature
In addition to BLS signatures, some Identity-Based Signatures (IBS) are also constructed from bilinear pairings, including the identity-based signature in IEEE P1363 [111] and the SM9 signature. To cope with leakage of the master secret key (MSK) in IBS, we briefly introduce thresholding schemes for these signatures.
The SM9 identity-based cryptographic algorithm standard was published by the State Cryptography Administration of China (SCA) on March 28, 2016; its digital signature part can also be used for thresholding.
Definition 14 (SM9 Digital Signature Scheme). For system initialization, the SM9 digital signature scheme is given two $q$-order multiplicative cyclic groups $\mathbb{G}_1$ and $\mathbb{G}_2$ with generators $g_1$ and $g_2$ and a bilinear map $e: \mathbb{G}_1 \times \mathbb{G}_2 \rightarrow \mathbb{G}_T$; then choose the hash functions $H_1: \{0,1\}^* \rightarrow \mathbb{Z}_q^*$ and $H_2: \{0,1\}^* \rightarrow \mathbb{Z}_q^*$. The algorithms of the SM9 signature are as follows:
1. $\mathsf{Setup}(1^\lambda)$: On input the security parameter $\lambda$, the algorithm outputs the system public parameters $pp = (\mathbb{G}_1, \mathbb{G}_2, \mathbb{G}_T, q, e, g_1, g_2, H_1, H_2)$.
2. $\mathsf{MasterKeyGen}(pp)$: The key generation center (KGC) chooses a random master private key $s \in \mathbb{Z}_q^*$ and computes the master public key $P_{pub} = g_2^s$.
3. $\mathsf{Extract}(s, ID)$: For the user identity $ID$, the KGC computes $t_1 = (H_1(ID) + s) \bmod q$, and the private key for the user is $d_{ID} = g_1^{s \cdot t_1^{-1}}$.
4. $\mathsf{Sign}(d_{ID}, m, P_{pub})$: On input $d_{ID}$, the message $m$, and the master public key $P_{pub}$, compute $g = e(g_1, P_{pub})$, randomly choose $r \in \mathbb{Z}_q^*$, compute $w = g^r$, $h = H_2(m \| w)$, and $l = (r - h) \bmod q$ (restart with a new $r$ if $l = 0$), compute $S = d_{ID}^{\,l}$, and output the signature $\sigma = (h, S)$.
5. $\mathsf{Verify}(ID, m, (h, S), P_{pub})$: On input $ID$, $m$, $(h, S)$, and $P_{pub}$, the verifier checks whether $h \in \mathbb{Z}_q^*$; if not, then fail. Then compute $g = e(g_1, P_{pub})$, $t = g^h$, $h_1 = H_1(ID)$, $P = g_2^{h_1} \cdot P_{pub}$, $u = e(S, P)$, and $w' = u \cdot t$ to check $H_2(m \| w') = h$; output $1$ if the equation holds, otherwise the signature is invalid.
In 2018, He et al. [
112] disclosed a two-party signature scheme based on SM9. The main idea is private key splitting: the KGC generates two secret shares related to the private key and distributes them to the two participants. The scheme uses additive homomorphism to compute the second part of the signature and zero-knowledge proofs to reduce the risk of data tampering.
In 2019, He et al. [
113] disclosed a method for jointly generating SM9 digital signatures by multiple parties in an asymmetric setting. In this method, the user's private key is shared additively, and the shared key serves as the basis of the signature. During signature generation, each participant selects a random number, then computes and broadcasts an intermediate variable, from which every participant can derive a joint intermediate variable. Each participant then computes its contribution via a multiplication protocol and sends it to a designated party, which computes the second part of the signature. Similarly, He et al. [
114] disclosed a method for SM9 digital signatures by multiple parties in a symmetric setting in 2019.
In 2020, Mu et al. [
115] proposed a secure two-party SM9 signature scheme. Their protocol achieves strong protection of private keys by combining Paillier encryption and zero-knowledge proofs. However, the high overhead of the protocol in terms of performance limits its application in certain scenarios. Nevertheless, its enhancements in security and privacy provide a reliable solution for digital currency transactions.
In 2021, Zhang et al. [
116] proposed a distributed key generation scheme based on SM9. They pointed out that the generation of the master key in identity-based cryptography inevitably becomes a security issue that needs protection; to eliminate the single point of failure and the risk of master key leakage in the SM9 system, the authors proposed a threshold distributed key generation scheme. The scheme utilizes the share-conversion algorithm of the Pseudo-Random Secret Sharing (PRSS) protocol for master key and user signing key generation, with auxiliary values produced by a replicated secret sharing protocol. The scheme realizes efficient private key generation in one round and meets the IND-ID-CCA security standard while ensuring compatibility with SM9, as well as high efficiency and scalability.
In 2020, Feng et al. [
117] proposed a scheme based on IEEE P1363 identity-based signatures. They designed a new multiparty scalar multiplication protocol to realize distributed signatures. In the same year, He et al. [118] also proposed efficient two-party signatures based on a trusted KGC setup. To address the issues of key escrow and secure channels in identity-based signatures, Feng et al. [119] also proposed a new key generation protocol in 2020. They apply the scalar multiplication protocol proposed in their earlier paper [117] to compute the intermediate variables and use a random blinding factor in the key generation, thereby removing the need for a secure channel during key distribution. Subsequently, they extended the scheme from the two-party setting to a general threshold signature scheme based on the ideas of DKLs19 [89].
The IBS in IEEE P1363 is a standardization based on the BLMQ [120] signature. Jiang et al. [121] proposed a fully distributed threshold BLMQ signature scheme in 2023. The scheme uses VSS-Elliptic Curve (VSS-EC), VSS-Bilinear Group (VSS-BG), and Paillier encryption to design distributed protocols for the master private key generation, private key extraction, and signature generation phases, respectively, with identifiable abort functionality. The following year, Jiang et al. [122] proposed a non-interactive protocol that defends against adaptive corruption and achieves UC security. This scheme constructs a secure multiplication protocol from four sub-protocols based on Paillier encryption and Beaver triples, and uses it to build the threshold IBS scheme.
7 Threshold SM2 digital signature
SM2 is an elliptic curve public-key cryptographic algorithm released by the State Cryptography Administration of China (SCA) on December 17, 2010. It became a Chinese cryptographic industry standard (GM/T 0003-2012) in 2012 and a Chinese national standard (GB/T 32918-2016) in 2016, and Part 2 of the standard specifies the digital signature algorithm. The details of the SM2 digital signature are as follows:
Definition 15 (SM2 Digital Signature Scheme). For system initialization, the SM2 digital signature scheme is given a security parameter $\lambda$; choose a finite field $\mathbb{F}_p$ and generate an elliptic curve $E: y^2 = x^3 + ax + b$ over $\mathbb{F}_p$. The algorithms of the SM2 signature are as follows:
1. $\mathsf{Setup}(1^\lambda)$: On input the security parameter, this algorithm outputs the system public parameters $pp = (\mathbb{F}_p, E, G, n, H)$, where $G$ is a generator of the abelian group $E(\mathbb{F}_p)$ with prime order $n$ and $H$ is a hash function.
2. $\mathsf{KeyGen}(pp)$: Given the system public parameters $pp$, randomly select a private key $d \in [1, n-2]$ and compute the public key $P = d \cdot G$.
3. $\mathsf{Sign}(d, m)$: For the message $m$, compute the digest $e = H(Z_A \| m)$, where $Z_A$ is the hash of the user's identifiable information and public parameters; randomly choose a nonce $k \in [1, n-1]$, compute $(x_1, y_1) = k \cdot G$, $r = (e + x_1) \bmod n$, and $s = ((1+d)^{-1} \cdot (k - r \cdot d)) \bmod n$, and output the signature $\sigma = (r, s)$.
4. $\mathsf{Verify}(P, m, \sigma)$: Given the system public parameters, the public key $P$, the message $m$, and the signature value $(r, s)$, the verifier performs the following steps: if $r \notin [1, n-1]$ or $s \notin [1, n-1]$, then the verification fails; calculate the message hash value $e = H(Z_A \| m)$; calculate $t = (r + s) \bmod n$, and if $t = 0$, then the verification fails; calculate $(x_1', y_1') = s \cdot G + t \cdot P$; calculate $R = (e + x_1') \bmod n$. Verify whether the equation $R = r$ holds. If it holds, $(r, s)$ is a legal signature and output $1$; otherwise, the signature is invalid.
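To illustrate the signing and verification equations above, here is a minimal sketch over a tiny textbook curve ($y^2 = x^3 + 2x + 2$ over $\mathbb{F}_{17}$, group order 19). Real SM2 uses the standardized 256-bit curve and the SM3 hash, and the digest input includes the identifier hash $Z_A$, which is replaced here by a plain hash of the message; all parameters below are toy values for illustration only.

```python
import hashlib, random

# Toy curve (textbook example): y^2 = x^3 + 2x + 2 over F_17, group order N = 19.
P_FIELD, A = 17, 2
N = 19                        # prime order of the group generated by G
G = (5, 1)                    # generator

def ec_add(P, Q):
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_FIELD == 0:
        return None                                           # P + (-P) = point at infinity
    if P == Q:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_FIELD) % P_FIELD
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P_FIELD) % P_FIELD
    x3 = (lam * lam - x1 - x2) % P_FIELD
    return (x3, (lam * (x1 - x3) - y1) % P_FIELD)

def ec_mul(k, P):
    R = None
    while k:
        if k & 1: R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

def digest(msg):              # stand-in for the SM3-based e = H(Z_A || M)
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % N

def sm2_sign(d, msg):
    e = digest(msg)
    while True:
        k = random.randrange(1, N)
        x1, _ = ec_mul(k, G)
        r = (e + x1) % N
        if r == 0 or (r + k) % N == 0:
            continue                                          # re-draw k, as the standard requires
        s = pow(1 + d, -1, N) * (k - r * d) % N
        if s != 0:
            return (r, s)

def sm2_verify(Ppub, msg, sig):
    r, s = sig
    if not (1 <= r < N and 1 <= s < N): return False
    t = (r + s) % N
    if t == 0: return False
    R = ec_add(ec_mul(s, G), ec_mul(t, Ppub))                 # s*G + t*P
    return R is not None and (digest(msg) + R[0]) % N == r

d = random.randrange(1, N - 1)        # private key
Ppub = ec_mul(d, G)                   # public key
sig = sm2_sign(d, b"toy message")
assert sm2_verify(Ppub, b"toy message", sig)
```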
In 2014, Shang et al. [
123] proposed the SM2 threshold signature scheme for the first time, based on secure multi-party computation realized with the secret sharing idea. The paper uses Shamir's Secret Sharing (SSS) for both the key and the random number and accomplishes the co-signing operation by letting multiple users securely and collaboratively compute products and inverses. The paper also points out that, because degree-$t$ sharings of the key and the nonce are multiplied, the number of signing participants must be large enough to reconstruct the resulting degree-$2t$ sharing, i.e., at least $2t+1$, as illustrated by the sketch below.
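A minimal numeric sketch of that constraint, under assumed toy parameters (the modulus, threshold, and participant count are illustrative): multiplying two degree-$t$ Shamir sharings pointwise yields a degree-$2t$ sharing, so $t+1$ shares are generally insufficient while $2t+1$ always suffice.

```python
import random

q = 7919                                 # toy prime modulus

def share(secret, t, n):
    """Shamir sharing with a degree-t polynomial; share i is f(i)."""
    coeffs = [secret] + [random.randrange(q) for _ in range(t)]
    return {i: sum(c * pow(i, j, q) for j, c in enumerate(coeffs)) % q
            for i in range(1, n + 1)}

def reconstruct(points):
    """Lagrange interpolation at 0 from the given {i: f(i)} points."""
    secret = 0
    for i, y in points.items():
        lam = 1
        for j in points:
            if j != i:
                lam = lam * (-j) * pow(i - j, -1, q) % q
        secret = (secret + y * lam) % q
    return secret

t, n = 2, 7
d = random.randrange(q)                  # private key
k = random.randrange(q)                  # nonce
d_sh, k_sh = share(d, t, n), share(k, t, n)

# Pointwise product of the shares is a degree-2t sharing of d*k ...
prod_sh = {i: d_sh[i] * k_sh[i] % q for i in d_sh}

# ... so t+1 = 3 participants are generally NOT enough,
few = {i: prod_sh[i] for i in [1, 2, 3]}
print(reconstruct(few) == d * k % q)     # almost always False

# ... while 2t+1 = 5 participants always suffice.
enough = {i: prod_sh[i] for i in [1, 2, 3, 4, 5]}
assert reconstruct(enough) == d * k % q
```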
Lin et al. [
124] designed an SM2 collaborative signature scheme for two-party cloud computing scenarios. The basic idea is to split the SM2 private key into two parts, stored in the cloud and on the terminal, respectively. The signature operation requires the cloud and the terminal to cooperate, thus avoiding the security risks of storing the entire private key on the terminal.
In 2020, Zhang et al. [
125] proposed a two-party SM2 signature algorithm based on homomorphic encryption and gave provable security guarantees. However, the signing process relies on Paillier homomorphic encryption, so the execution efficiency of the scheme is low. Moreover, the scheme requires all participants to be online together, and the failure of any one party causes the signature to fail; in distributed scenarios without trusted centres, such as Bitcoin and Ethereum, the loss of any one party's private key prevents the signature from being produced. In the same year, Su et al. [126] proposed an efficient, provably secure two-party signature protocol that transforms the random numbers of the original algorithm to reduce computation and communication, and they also proposed an application protocol for enhanced security in practical deployments.
Feng et al. [
127] designed a lightweight SM2 two-party signing scheme based on the idea of joint signing in 2020. It allows the client and the server to complete an SM2 digital signature without either side disclosing its share of the signing key; both parties must participate in generating the signature, and the complete signing key is never reconstructed during the process, which guarantees the security of the signing key.
In 2022, Han et al. [
128] proposed an efficient two-party SM2 signature protocol based on secret sharing and Beaver multiplication triples. The protocol uses two Beaver multiplication triples and significantly improves signing efficiency by reducing the computation and communication overheads; it is 30-40 times faster than existing homomorphic-encryption-based schemes and is proven secure under chosen-message attacks. The scheme relies on a trusted third party, reduces the number of rounds to five when the Beaver triples are pre-shared, and achieves signing efficiency very close to that of the original SM2 algorithm.
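As a reminder of how such pre-shared triples are consumed, here is a minimal two-party sketch (the dealer, modulus, and share layout are illustrative assumptions, not the concrete protocol of the scheme above): a triple $(a, b, c = ab)$ lets the parties turn additive shares of $x$ and $y$ into additive shares of $x \cdot y$ using only local arithmetic and two openings.

```python
import random

q = 7919                                   # toy prime modulus

def add_share(v):
    """Split v into two additive shares mod q."""
    s0 = random.randrange(q)
    return s0, (v - s0) % q

# Trusted dealer (as assumed by the scheme discussed above): a Beaver triple c = a*b.
a, b = random.randrange(q), random.randrange(q)
a_sh, b_sh, c_sh = add_share(a), add_share(b), add_share(a * b % q)

# The two parties hold additive shares of the secrets x and y.
x, y = random.randrange(q), random.randrange(q)
x_sh, y_sh = add_share(x), add_share(y)

# Each party opens d_i = x_i - a_i and e_i = y_i - b_i; the opened d, e hide x and y.
d = (x_sh[0] - a_sh[0] + x_sh[1] - a_sh[1]) % q
e = (y_sh[0] - b_sh[0] + y_sh[1] - b_sh[1]) % q

# Local computation of shares of x*y (the d*e correction is added by party 0 only).
z0 = (c_sh[0] + d * b_sh[0] + e * a_sh[0] + d * e) % q
z1 = (c_sh[1] + d * b_sh[1] + e * a_sh[1]) % q

assert (z0 + z1) % q == x * y % q
```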
In 2024, Liang et al. [
129] proposed a non-interactive SM2 threshold signature scheme with the identifiable abort property, aiming to address the shortcomings of existing schemes in terms of efficiency and security. This scheme uses partially homomorphic encryption to encrypt and share private keys and random numbers. It optimizes Paillier encryption so that the signature generation process only requires the message as input in the last round, significantly reducing the number of interaction rounds and improving the efficiency of the online signing phase. The scheme also employs SSS and Verifiable Secret Sharing (VSS) techniques to ensure secure key distribution in the key generation phase and realizes distributed computation of threshold signatures through multiparty additive sharing. In addition, it designs a key update policy that allows dynamic adjustment of the threshold and the number of participants, making it more flexible for application scenarios such as blockchain and cryptocurrency wallets. Compared with Canetti's method [
87], although the communication overhead in the pre-signature phase grows linearly with the number of participants, this scheme is significantly optimized in terms of communication and computation overheads and still offers better performance and scalability in general.
In the same year, Li et al. [
130] proposed a fast two-party SM2 scheme using MtA functionality, named 2SM2. The construction is based on re-sharing and MtA protocols, with multiplicative sharing of the private key and additive sharing of the random numbers. They design and implement a non-interactive online signing phase, and the Paillier-based MtA protocol is required only once in the scheme, improving efficiency. Finally, they proved the security of the scheme in the ROM against a static adversary.
In 2024, Liu et al. [
131] presented a survey of two-party SM2 signatures based on previous schemes, organized by multiplicative and additive splitting of the private key, covering a variety of methods for transforming the random numbers and keys, and provided security and performance analyses. Finally, we give a comparison of multiple two-party SM2 signatures in Tab.7.
8 Relevant applications
Threshold signatures play important roles in many fields and applications, such as blockchain [
77], cloud computing [
132], and Internet of Things (IoT) [
Threshold ECDSA, one of the earliest such schemes, is applied in blockchain [76, 77, 134]. There have been many studies on the application of threshold ECDSA in blockchain, mainly to protect the security of blockchain transactions. Unlike ECDSA, Schnorr and EdDSA signatures are threshold-friendly, and their threshold versions require little extra overhead, so many blockchain systems have gradually started to adopt EdDSA as a threshold signature algorithm [135, 136]. Pairing-based threshold algorithms are favoured in blockchain because they aggregate signatures more efficiently when the number of operators is large [107, 110, 134]. In addition, the thresholding of IBS can also mitigate the problem of MSK leakage in the IoT.
Schnorr signatures can be used for consistency protocols in distributed systems [
42], such as consensus mechanisms in distributed ledger technology, where multiple nodes coordinate and reach agreement through threshold signatures and signature aggregation. EdDSA is used in certain security protocols, such as distributed authentication and authorization systems [
75], which allow multiple authorized parties to generate valid signatures. It can also be used for collaborative signatures in distributed applications, such as smart contract execution in blockchain systems [
73,
137], which allows multiple participants to co-sign transactions or contracts.
9 Potential directions
In this paper, we have not presented the thresholding of RSA signatures or post-quantum (PQ) signatures, mainly because many aspects of these two families of threshold signatures remain to be addressed.
The RSA algorithm is a public-key algorithm based on the hardness of the integer factorization problem and is one of the most widely used public-key encryption and digital signature algorithms in the world. Earlier researchers have proposed threshold RSA signature schemes achieving active security [138], adaptive security [139], etc.
Frederiksen et al. [
140] proposed a two-party RSA key generation scheme; however, the scheme requires multiple invocations of the Paillier homomorphic encryption algorithm, which imposes high communication and computational overheads, and it cannot be scaled up to a multi-party setting.
In 2023, Tessaro et al. [
141] proposed a scheme based on linear hash functions that achieves security relying only on the discrete logarithm assumption and the RSA assumption, and partially realizes non-interactivity. By generalizing the existing FROST [38] and MuSig2 [56] schemes, the authors construct a two-round protocol in which the first-round messages are independent of the message to be signed, greatly improving practicality. The scheme simplifies computation by replacing the exponential map with a linear map via a linear hash function and proves security under the Algebraic One-More Preimage Resistance (AOMPR) assumption. In addition, the authors address the difficulties of computing Lagrange coefficients and modular inverses in RSA threshold signatures and propose concrete instantiations based on discrete logarithms and RSA in the random oracle model.
Overall, the functional design of threshold RSA signatures has been well studied, but distributed key generation and the accompanying security analysis need further research, for example, computing inverses when the factorization of the modulus (and hence the group order) must remain hidden.
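One well-known workaround for the Lagrange-coefficient problem, used for instance in Shoup's threshold RSA, is to scale the coefficients by $\Delta = n!$ so that they become integers and no inversion modulo the secret group order is needed; the factor $\Delta$ is then removed once at the end. A minimal numeric sketch of this integrality (the polynomial, the subset of participants, and $n$ below are arbitrary illustrative choices):

```python
from fractions import Fraction
from math import factorial

def scaled_lagrange_at_zero(ids, n):
    """Return {i: Delta * lambda_i} with Delta = n!; each value is an integer."""
    delta = factorial(n)
    out = {}
    for i in ids:
        lam = Fraction(1)
        for j in ids:
            if j != i:
                lam *= Fraction(0 - j, i - j)
        scaled = delta * lam
        assert scaled.denominator == 1          # Delta clears every denominator
        out[i] = int(scaled)
    return out

# Example: reconstruct f(0) for f(X) = 5 + 2X + 9X^2 from 3 of 7 shares over the
# integers, without ever inverting the Lagrange denominators.
f = lambda x: 5 + 2 * x + 9 * x * x
ids, n = [2, 4, 7], 7
coeffs = scaled_lagrange_at_zero(ids, n)
delta = factorial(n)
assert sum(coeffs[i] * f(i) for i in ids) == delta * f(0)
```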
Existing public-key cryptography relies on mathematical problems that are currently considered intractable on conventional computers. Conventional threshold signatures are already well developed, with many schemes satisfying different security assumptions and offering a rich set of features. However, with the advent of quantum computers, Shor's algorithm [142] makes it possible to solve the integer factorization problem in a much shorter time, and discrete logarithm and elliptic curve problems are equally at risk under quantum attacks.
To cope with quantum computer attacks, researchers have started working on PQ-based signatures. In 2017, NIST began to collect proposals for PQ-based signature schemes, and finally, three schemes [
143−
145] were selected for digital signatures.
In addition to the schemes standardized by NIST, quantum-resistant signature algorithms can nowadays be classified into lattice-based, hash-based, multivariate-quadratic-based, and isogeny-based constructions. Several researchers have proposed different methods to instantiate lattice-based threshold signatures [146, 147]. Post-quantum threshold signatures will thus become a focus of future research, and ensuring the security of the data transmitted during interaction is also one of the research challenges.
In contrast, research on EdDSA can focus on designing standards-compliant, high-performance threshold signature schemes with high efficiency and low communication. Research on threshold ECDSA can focus on designing weighted, fault-tolerant schemes for scenarios such as board voting, where a signature can be produced once the accumulated weight is large enough. Beyond that, one can consider how to remove the bulletin board while still implementing identifiable abort (IA) and proactive functionalities.