Placing Trust in Whonix ™
Trust is a very problematic issue. This is the essence of why security is difficult in every field, including general computing and Internet communication. A skeptical user might ask themselves the following questions before relying upon Whonix ™ for sensitive activities on a daily basis:
- Can Whonix ™ and its developers be trusted?
- Are backdoors present in Whonix ™ that can take control over a computer or exfiltrate data?
- Does Whonix ™ generate compromised encryption keys to enable government spying?
- How trustworthy and sincere are the stated anonymity goals of the Whonix ™ project?
Opinions will vary widely, but the reasoning process used to reach the conclusion should be closely examined. It is important that both trust and distrust are based on facts, and not gut feelings, instincts, paranoid conceptions, unfounded hearsay or the words of others.
It is unsurprising that the Whonix ™ project and other anonymity platforms / tools claim to be honest, but written assurances are worthless. For an informed decision, it is worth looking at the bigger Whonix ™ picture: core components, affiliations, project track record, and how reasonable trust might be established.
Freedom Software and Public Scrutiny
Whonix ™ and other Freedom Software make it possible to check the source code to determine how a software distribution functions and what it consists of. Suitably skilled individuals can thoroughly audit the code to search for the presence of any malicious code, like a backdoor. In addition, software can be manually built from source code and the result compared against any versions that are pre-built and already being distributed, like the Whonix ™ ova images that can be downloaded from . This comparison can determine whether any malicious changes were made, or if the distributed version was actually built from the source code.
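As a rough sketch of this comparison step (file names are hypothetical, and this is not the official Whonix ™ verification procedure), two builds can be compared by hashing them:

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def images_match(self_built_path, downloaded_path):
    """True if the self-built and downloaded images are byte-for-byte identical."""
    return sha256_of(self_built_path) == sha256_of(downloaded_path)
```

Note that without reproducible builds (discussed below), two honest builds may legitimately differ, so a digest mismatch alone is not proof of tampering.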
Naturally most people do not have the requisite knowledge, skills or time to properly audit software. However, the public scrutiny of popular, open source software implies a certain degree of trustworthiness. The axiom attributed to Linus Torvalds  -- "Given enough eyeballs, all bugs are shallow" -- is a reasonable assumption in user communities that are large, vibrant, and focused on fixing security vulnerabilities quickly.  The Freedom Software community has a strong tradition of publicly reporting and resolving serious issues, and a large pool of developers and beta testers can help to identify and remedy problems. 
The opposite of Freedom Software is non-freedom software. Freedom Software provides strong advantages over non-freedom software, which should be avoided. The case for Freedom Software is made on the avoid non-freedom software wiki page.
Table: Finding Backdoors in Freedom Software vs Non-Freedom Software
| Criterion | Non-Freedom Software (precompiled binaries) | Freedom Software (source-available) |
|---|---|---|
| Can view original source code | No | Yes |
| Compiled binary file can be decompiled into disassembly | Yes | Yes |
| Regular pre-compiled binaries | Depends; some use binary obfuscators. | Yes |
| Usually no obfuscation [archive] (anti-disassembly, anti-debugging, anti-VM) | Depends; some use it. | Yes |
| Price of a security audit looking for backdoors | Very high | Lower |
| Difference between precompiled version and self-compiled version | Unavailable | Small or none |
| No requirement for reverse-engineering [archive] | No | Yes |
| Assembler language skills required | Much more | Less |
| Always legal to decompile / reverse-engineer | No | Yes |
| Possibility of catching backdoors by observing incoming and outgoing Internet connections | Very difficult | Very difficult |
| Convenience of spotting backdoors | Lowest convenience | Very high convenience |
| Difficulty of spotting a "direct" backdoor | Much higher difficulty | Much lower difficulty |
| Difficulty of spotting a bugdoor | Very much higher difficulty | Lower difficulty |
| Third parties can legally fork [archive] the software and release a patched version without the backdoor | No | Yes |
| Third parties can possibly make (possibly legally questionable) modifications such as disabling serial key checks | Yes | Yes |
| Can always modify the software | No | Yes |
| Third parties can use static code analysis tools | No | Yes |
| Third parties can judge source code quality | No | Yes |
| Third parties can find logic bugs in the source code | No | Yes |
| Third parties can find logic bugs in the disassembly | Yes | Yes |
| Can benefit from the worldwide wisdom of the crowd | No | Yes |
| Third parties can benefit from debug symbols [archive] during analysis | Depends; some may publish debug symbols. | Yes |
| Display source code intermixed with disassembly | No | Yes |
| Effort to audit subsequent releases | Almost the same | Usually lower |

forum discussion [archive]
Spotting backdoors is already very difficult in Freedom Software, where the full source code is available to the general public. Spotting backdoors in non-freedom software, distributed as obfuscated binaries, is exponentially more difficult.
Reproducible builds are a set of software development practices that create an independently-verifiable path from source to binary code.
Whilst anyone may inspect the source code of free and open source software for malicious flaws, most software is distributed pre-compiled with no method to confirm whether they correspond.
This incentivises attacks on developers who release software, not only via traditional exploitation, but also in the forms of political influence, blackmail or even threats of violence.
This is a particular concern for developers collaborating on privacy or security software: attacks here typically result in compromising especially politically-sensitive targets such as dissidents, journalists and whistleblowers, as well as anyone wishing to communicate securely under a repressive regime.
Whilst individual developers are a natural target, it additionally encourages attacks on build infrastructure, as a successful attack would provide access to a large number of downstream computer systems. By modifying the generated binaries here instead of the upstream source code, illicit changes are essentially invisible to the original authors and users alike.
The motivation behind the Reproducible Builds project is therefore to allow verification that no vulnerabilities or backdoors have been introduced during this compilation process. By promising identical results are always generated from a given source, this allows multiple third parties to come to a consensus on a “correct” result, highlighting any deviations as suspect and worthy of scrutiny.
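The consensus idea can be sketched in a few lines (builder names are hypothetical): with reproducible builds, every honest builder obtains the same digest for the same source, so any deviating digest is immediately suspect.

```python
from collections import Counter

def build_consensus(builder_digests):
    """Given {builder_name: digest_of_independently_built_binary},
    return (consensus_digest, suspect_builders). Reproducible builds
    promise identical output from identical source, so any builder
    whose digest deviates from the majority result is worth scrutiny."""
    counts = Counter(builder_digests.values())
    consensus_digest, _ = counts.most_common(1)[0]
    suspects = sorted(name for name, digest in builder_digests.items()
                      if digest != consensus_digest)
    return consensus_digest, suspects
```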
This ability to notice whether a developer has been compromised deters such threats or attacks from occurring in the first place, as any compromise would be quickly detected. This offers comfort to front-liners: even if they are threatened, they cannot be silently coerced into exploiting or exposing their colleagues or end-users.
Trusting Debian GNU/Linux
Nearly all the software shipped in Whonix ™ comes from the Debian GNU/Linux distribution [archive]. Debian's packages are heavily scrutinized as it is one of the largest Linux distributions [archive] at present. Debian is also one of the most popular distributions for derivative platforms; Ubuntu Linux [archive] is a Debian derivative, and the same applies to all Ubuntu derivatives such as Linux Mint [archive].
The sheer number using Debian's software packages and the large developer pool inspecting software integrity are significant factors in Debian's favor. Debian regularly identifies and patches serious security issues [archive] like the infamous SSH PRNG vulnerability , but backdoors or other purposeful security holes have never been discovered to date. Debian's focus on security is further evidenced by their Security Audit team which constantly searches for new or unfixed security issues. 
Whonix ™ anonymity is based on Tor, which is developed by The Tor Project [archive]. Tor is a mature anonymity network with a substantial user base, and it has developed a solid reputation after more than 15 years of development. Tor's distributed trust model makes it difficult for any single entity to capture a user's traffic and identify them on a consistent basis.
Tor and its general development are subject to heavy public scrutiny by academics, security professionals and a host of developers.  For example, there is a body of Tor research related to potential attack vectors on onion routing and the adequacy of current defenses, and the source code has undergone several external audits. Like any software project, numerous security issues have been identified and resolved over the years, but a purposeful backdoor has never been discovered.  Theories about deliberate backdoors in Tor are considered highly speculative and lacking any credible basis.
Trusting Whonix ™
In one sense, Whonix ™ is simply the union of Debian and Tor, plus a mechanism to glue them together. If a user already trusts Debian and The Tor Project, then all that remains is a method for assessing the trustworthiness of Whonix ™ itself.
The Whonix ™ project was founded on 11 January 2012. It previously existed under different project names, including TorBOX and aos. As mentioned earlier, Whonix ™ is Freedom Software, which makes the source code available for inspection. In the main, Whonix ™ consists of specifications for which Debian software packages should be installed, along with their appropriate configuration. See also this list of notable reviews and feedback about the security of Whonix ™.
With a relatively small development team and estimated user base, the "many eyeballs" theory may work against Whonix ™ at present. However, the source code is comparably small and devoid of complexities, meaning the project is in relatively good shape compared to many other similar projects. Interested readers can learn more about the Whonix ™ specification and design here. 
With these factors in mind, the reader can now make an informed decision about the trustworthiness of Whonix ™.
Whonix ™ Warrant Canary
The Whonix ™ warrant canary [archive] is intended to provide a means of communication to users in the event Whonix ™ is served with a secret subpoena, despite legal prohibitions on revealing its existence. For any canary in force, a successfully verified OpenPGP signature on the canary file indicates that, as of the date of signing, no such warrants had been served on the Whonix ™ project.
Note: the canary's date of issue is represented by its OpenPGP signature date. A new canary should be released within four weeks.
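The four-week freshness rule lends itself to a mechanical check. A minimal sketch (extracting the signature date from `gpg --verify` output is left aside):

```python
from datetime import datetime, timedelta, timezone

MAX_CANARY_AGE = timedelta(weeks=4)

def canary_is_fresh(signature_date, now=None):
    """True if the canary's OpenPGP signature date falls within the
    four-week window in which a new canary is expected. Future-dated
    signatures are also rejected."""
    if now is None:
        now = datetime.now(timezone.utc)
    return timedelta(0) <= now - signature_date <= MAX_CANARY_AGE
```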
The canary and signature are available here:
- Canary text file: https://download.whonix.org/whonixdevelopermetafiles/canary/canary.txt [archive] (v3 onion [archive])
- OpenPGP signature: https://download.whonix.org/whonixdevelopermetafiles/canary/canary.txt.asc [archive] (v3 onion [archive])
As a backup, the canary and signature are also available on github: 
Readers are reminded this canary scheme is not infallible. The canary declaration is provided without any guarantee or warranty, and it is not legally binding upon any parties in any form. The signer should never be held legally responsible for any statements made in the canary.
Trusting Downloaded Images
Users should not blindly trust the Whonix ™ project or its developers. Logically it is unwise to trust unknown persons, especially on the Internet. On that basis, trust in Whonix ™ founder Patrick Schleizer should not rely on his public persona or the appearance of the Whonix ™ project alone. Whonix ™ may be or could become a high profile target, and it is risky to assume that Schleizer's build machine would remain clean under those circumstances.
Binary images can be trusted to some extent if a user verifies that they received exactly the same code as thousands of other users, and no one has found or publicly reported any serious security issues. This requires verification of the Whonix-Workstation ™ and Whonix-Gateway ™ images using the available OpenPGP signatures.  All binary releases and source code tags for releases are OpenPGP-signed by lead Whonix ™ developer Patrick Schleizer.
In order of increasing security, the Whonix ™ images can be:
- Downloaded via . TLS provides some trust and integrity of the hash file, but it is still advisable to check the site's certificate and perform manual OpenPGP verification.
- Downloaded over the Whonix ™ v3 onion address [archive] with Tor Browser before OpenPGP verification. Onion addresses provide a higher standard of authentication than clearnet addresses.
- Built from source since it is a relatively easy procedure. 
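Each download method above ends with the same OpenPGP check. A minimal sketch of wrapping GnuPG for that step (file names are hypothetical; it assumes `gpg` is installed and the Whonix ™ signing key has already been imported and authenticated):

```python
import subprocess

def gpg_verify_command(signature_path, data_path):
    """Build the GnuPG invocation for verifying a detached signature."""
    return ["gpg", "--verify", signature_path, data_path]

def signature_is_valid(signature_path, data_path):
    """Run gpg; exit status 0 means GnuPG reported a good signature.
    This says nothing about *whose* key made the signature unless the
    key itself was verified beforehand."""
    result = subprocess.run(gpg_verify_command(signature_path, data_path),
                            capture_output=True)
    return result.returncode == 0
```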
Trusting the Whonix ™ Website
Web Application Shortcomings
As noted in the Privacy on the Whonix ™ Website chapter, three separate web-based platforms are currently in use:
- Discourse [archive] for the Whonix ™ forums.
- MediaWiki [archive] for online documentation.
- Phabricator [archive] (mostly) for the Whonix ™ project's issue/bug tracker.
The problem is these web applications (web apps) are developed independently from Whonix ™. This means Whonix ™ developers have little to no control over the course these projects take. Since privacy and security issues often take a back seat to "enhanced features", websites relying on these or similar web apps can at best only provide privacy by policy, which is equivalent to a promise.
It is infeasible from a monetary, time and manpower perspective to address perceived shortcomings in these web apps. This means the Whonix ™ community should not place undue trust in the live version of this site on the Internet, due to the potential for interference.
In an identical fashion to the Qubes project, Whonix ™ has adopted the principle that all infrastructure should be explicitly distrusted. Infrastructure in this context refers to "...hosting providers, CDNs, DNS services, package repositories, email servers, PGP keyservers, etc."
Third parties who operate infrastructure are "known unknowns" and potentially hostile. It is safer to voluntarily place trust in a few select entities, such as the contributors of Whonix ™ packages, the holder(s) of Whonix ™ signing keys and so on. By sufficiently securing endpoints, it becomes unnecessary to try to improve the trustworthiness of those operating the "mid-points". This approach also provides two benefits: Whonix ™ forgoes the need to invest valuable resources in the problem, and no illusory security expectations are raised in the Whonix ™ community.
What does it mean to “distrust the infrastructure”?
A core tenet of the Qubes philosophy is “distrust the infrastructure,” where “the infrastructure” refers to things like hosting providers, CDNs, DNS services, package repositories, email servers, PGP keyservers, etc. As a project, we focus on securing endpoints instead of attempting to secure “the middle” (i.e., the infrastructure), since one of our primary goals is to free users from being forced to entrust their security to unknown third parties. Instead, our aim is for users to be required to trust as few entities as possible (ideally, only themselves and any known persons whom they voluntarily decide to trust).
Users can never fully control all the infrastructure they rely upon, and they can never fully trust all the entities who do control it. Therefore, we believe the best solution is not to attempt to make the infrastructure trustworthy, but instead to concentrate on solutions that obviate the need to do so. We believe that many attempts to make the infrastructure appear trustworthy actually provide only the illusion of security and are ultimately a disservice to real users. Since we don’t want to encourage or endorse this, we make our distrust of the infrastructure explicit.
Self-Hosting vs Third Party Hosting
Some users mistakenly believe that the servers of security-focused projects are virtually impenetrable and hosted in the homes of developers; this is not the case. The whonix.org server is actually hosted at an Internet hosting company. Similarly, The Tor Project [archive] and Tails [archive] servers are not hosted in a developer's home either. Hosting at home is the exception rather than the rule; at the time of writing, there are no known cases where servers are hosted in a developer's home. This means employees of the associated Internet hosting company have physical access rights to the server, along with any other capable, malicious actors.
Since virtually every project is hosted by a third party (an Internet hosting company), the capability to physically secure server hardware is largely forfeited. Without physical security and due to the risk of untrusted visitors, a hardware backdoor could easily compromise the security of the server.
Any demand that servers ought to be super secure and hosted in a developer's home is idealistic. Home Internet connections are generally too slow to meet the requirements of a public web server in terms of traffic quota and connection upload speed. Internet service providers (ISPs) do not usually allow a busy public web server to be hosted on home connections; throttled connections or terminated contracts are likely if that happens.
The "proper solution" would require purchase of a business Internet uplink, similar to becoming an Internet hosting company. This would incorporate a business building with a good Internet uplink, full camera security, security officers and so forth. Unfortunately this is economically infeasible at the current stage of project development.
Many web applications in use by whonix.org did not provide software signatures at installation time, or still do not provide them. Therefore, in stark contrast to the software installed by default in Whonix ™, it has not always been possible to enforce verification of software signatures on the whonix.org server. Many web application and extension updaters also did not, or still do not, securely verify software signatures. Consequently, the security level of most servers is probably only equivalent to plaintext. In the case of the whonix.org server, the system security level is only equivalent to "always use TLS", not "always verify software signatures".
In the past, various suggestions for "perfect server privacy"  were made such as "self-hosting in developers' homes" or "host the server outside the five eyes [archive] (nine eyes, fourteen eyes [archive]) countries". Despite the good intentions, these suggestions do not easily translate into an actionable plan.
First, these suggestions assume there is a sane method of rating the privacy protections afforded by a specific country. Moreover, the privacy rights granted for local citizens in a specific jurisdiction do not necessarily extend to non-citizens. Whonix ™ developers are unaware of any project that rates privacy protections in this way, considers the feasibility of operating servers (by running tests), and then makes recommendations for locations which provide the best possible privacy.
In today's world following the Snowden disclosures, it has to be assumed that if surveillance is possible it is being done. The likelihood is that surveillance is undertaken in all jurisdictions, and it is only a matter of degree.
Even The Tor Project -- a much older, established and better funded organization -- does not attempt to implement any suggestion concerning "perfect server privacy". As noted on their sponsor's page [archive]:
Fastly generously hosts our Tor Browser update downloads that can be fetched anonymously.
Fastly [archive] is providing content delivery network (CDN) services and is headquartered in America (arguably the most aggressive member of the five eyes network). Even Debian uses CDNs Amazon AWS and Fastly [archive].
In a similar fashion to the Distrusting Infrastructure chapter, Whonix ™ has concluded it is not worthwhile investing valuable resources to try and provide "perfect server privacy", because it is simply uneconomical. For this reason, the viewpoint that no undue trust should be placed in the server arrangements is made explicit.
Server security issues should not be conflated with software security issues. If an advanced adversary wanted to tarnish the reputation of any security-focused project, then breaking into the data center where it was hosted and "hacking" them would be one way to achieve that aim. Projects that are honest need to mention this possibility beforehand, so it is not unexpected.
The world's largest and most profitable technology companies like Google, Facebook, Microsoft and Amazon can easily afford to employ large, dedicated and skilled teams of system administrators to work around the clock to protect their servers.  For small projects, this scale of server protection is completely unrealistic.
Software Update APT Repository Security
A compromise of the whonix.org server would not result in a compromise of users attempting to upgrade Whonix ™. This is because of a standard APT security feature: digital signature verification of APT repository metadata (SecureApt [archive]). An adversary who compromised the whonix.org server would lack the signing key required to generate validly signed APT repository metadata, and invalid APT repository metadata would be rejected by the user's APT updater software. The APT repository metadata is signed locally on the developer's computer before being uploaded to the whonix.org server; the Whonix ™ APT repository signing key is never exposed to the server. Thanks to digital signature verification, the Whonix ™ software update APT repository can in theory be considered more secure than the whonix.org server itself.
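Once the repository metadata itself is signature-verified, APT's remaining integrity chain is plain hashing: the signed metadata lists the expected digest of each file, and the client rejects any download that does not match. A sketch of that second step (package contents and metadata layout are hypothetical, simplified from APT's actual Release/Packages chain):

```python
import hashlib

def package_is_intact(package_bytes, signed_metadata):
    """Check a downloaded package against the SHA-256 digest listed in
    already signature-verified repository metadata. An attacker who swaps
    the package on a compromised server is caught here, because updating
    the signed digest would require the offline signing key."""
    actual = hashlib.sha256(package_bytes).hexdigest()
    return actual == signed_metadata["sha256"]
```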
Software Build Process Security
A compromise of the whonix.org server would not result in a compromise of software packages (image downloads, Debian packages, .debs) because all software is built locally on the developer's computer. No binary builds offered for download to users of software developed under the Whonix ™ umbrella are ever created on remote servers hosted by third parties. Users who always correctly verify software signatures could detect malicious software before use. However, note the Consequences of Server Compromise below.
Server Privacy vs Server Security
In an ideal world, both server privacy and server security would be maximized at the same time. However, in the real world this is an impossibility.
In a world with specialization and division of labour, companies that excel at hosting web applications have more focus, time, energy, knowledge and money to spend on server security; it is their raison d'etre (reason for being). In contrast, small projects use web applications only as a means to an end. Therefore, using third-party web application hosts may provide better security than self-hosting, while better server privacy demands self-hosting. This means it is impossible to optimize both security and privacy simultaneously; the goals are at odds with each other.
The almost perfect uptime of popular web services such as Google, Facebook, and Amazon (perhaps 99.99 per cent) might lead some to conclude this is an easy goal to achieve; this is a false assumption.
Expecting the same uptime from much smaller projects like Whonix ™ is unrealistic. At best, maybe only 99.0 per cent uptime can be provided because no resources are spent on server uptime statistics, server upgrades need to be performed, and reboots are necessary. These factors necessarily lead to downtime when the website is unavailable. With a huge budget it would be possible to approach the 99.99 per cent uptime that popular websites have via technical solutions such as server farms, load balancing, and failover [archive], but this is infeasible for small projects. Similarly, large companies can afford to pay for whole teams of system administrators who are working 24/7, in concert with these technical options. Again, small projects do not have that option.
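Converting availability percentages into concrete downtime makes the gap between these figures tangible:

```python
HOURS_PER_YEAR = 365 * 24  # 8760

def yearly_downtime_hours(uptime_percent):
    """Hours of downtime per (365-day) year implied by an uptime percentage."""
    return (100.0 - uptime_percent) / 100.0 * HOURS_PER_YEAR

# 99.99 % uptime allows under an hour of downtime per year (~53 minutes),
# while 99.0 % allows roughly 87.6 hours, i.e. more than three full days.
```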
Finally, server downtime is not evidence of a server compromise, but normally relates to server issues (for example, failing hard drives) and routine server maintenance.
Consequences of Server Compromise
A compromised whonix.org server could result in one or more of the following issues:
- Bad or malicious advice given to visitors of the website.
- Malicious software downloads offered to users who do not always verify software signatures.
- Unavailability of software downloads and updates.
- Reputational damage for Whonix ™.
Due to the multiple issues outlined in this section, the software produced by the Whonix ™ project is theoretically considered more secure than the website provided by the Whonix ™ project (whonix.org). The Whonix ™ software is the main product delivered by the Whonix ™ project, while the whonix.org server is only a tool to document and deliver Whonix ™.
Most users retrieve OpenPGP fingerprints directly from a website and then download the associated key from a key server. The problem with this method is that TLS is fallible: the connection could be insecure or broken. Greater security necessitates a key signing party, whereby a direct and trusted path of communication can be confirmed by all attendees. If this step is not followed, OpenPGP is only as secure as TLS.
This condition of meeting in person is often impossible to fulfil. To mitigate the risk, any OpenPGP fingerprint should be cross-referenced on multiple "secure" (https://) sites. An additional fail-safe is to use an alternative authentication system, for example comparing the Tor signing keys on both the clearnet and onion domains: https://www.torproject.org/docs/signing-keys.html [archive] and http://expyuzz4wqqyqhjn.onion/docs/signing-keys.html [archive].
Onion services offer strong authentication via multiple layers of encryption [archive]. This does not prohibit an advanced adversary from trying to impersonate an onion service, but together with multiple fingerprint sources, it becomes increasingly difficult and improbable that a single entity could impersonate them all.
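The cross-referencing advice above can be sketched mechanically (source names and fingerprints are hypothetical): collect the fingerprint as displayed by several independent channels and accept it only if they all agree.

```python
def normalize_fingerprint(fingerprint):
    """Strip spacing and case differences so display variants compare equal."""
    return fingerprint.replace(" ", "").upper()

def fingerprints_agree(published):
    """Given {source_name: fingerprint_as_displayed} collected from
    independent channels (clearnet HTTPS, onion service, keyserver, ...),
    return True only if every source shows the same fingerprint."""
    normalized = {normalize_fingerprint(fp) for fp in published.values()}
    return len(normalized) == 1
```

The more independent the sources, the harder it becomes for a single entity to impersonate them all.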
Whonix ™ Binaries and Git Tags
All Whonix ™ binaries are OpenPGP-signed by Whonix ™ developer Patrick Schleizer. The source code is directly available on GitHub over TLS and can be cloned using git over https://. Git tags for each release are also OpenPGP-signed by Whonix ™ developer Patrick Schleizer. Users can also request signed git development tags from the same developer.
Even if Whonix ™ developers are distrusted, verifying binary downloads or git tags with OpenPGP is still useful. For example in order to audit Whonix ™, it is important to verify the download came from Whonix ™ developers and that it was not tampered with by third parties. This is a realistic threat, as these recent examples show:
- An attacker could modify source code on GitHub [archive] (w [archive])
- sourceforge hacked [archive] (w [archive])
- sourceforge mirror hacked [archive] (w [archive])
The OpenPGP key also ensures that if the Whonix ™ infrastructure is ever compromised by a powerful adversary (such as a domain takeover), the original Whonix ™ developers can at least prove they owned the infrastructure.
Whonix ™ Developer OpenPGP Guidelines
All long-term Whonix ™ developers are encouraged to:
- Create a 4096/4096 RSA/RSA OpenPGP key.
- Retrieve the latest which comes with Whonix-Workstation ™ for stronger hashes, no-emit-version, and other improved settings.
- Store the private key inside an encrypted file.
- Make a backup of that encrypted file.
- Remember the password and regularly test one's memory of it.
- Upload the encrypted file to a (free) online cloud-based host to protect against theft, fire, natural events and so on.
From the beginning of the Whonix ™ project, greater trust has been placed in developers who publish their OpenPGP public key earlier on, since this reduces the probability of an evil developer attack.
Verifiable .ova Releases
Whonix previously had a feature which allowed the community to check that Whonix .ova releases were verifiably created from the project's own source code: verifiable builds. This only proves that the person and machine building Whonix have not added anything malicious, such as a backdoor. It does not prove there are no backdoors present in Debian; that is not possible, because neither Debian nor any other operating system provides deterministic builds yet.
This feature does not attempt to prove there are not any vulnerabilities present  in Whonix or Debian. Fatal outcomes are still possible via a remotely exploitable  bug in Whonix or Debian, a flaw in Whonix's firewall which leaks traffic, or code phoning home  the contents of the HDD/SSD. Community effort is a precondition to improved security with this feature, particularly auditing of Whonix and Debian source code to check for possible backdoors and vulnerabilities.
In summary, this feature is useful and potentially improves security, but it is not a magical solution for all computer security and trust issues. The following table helps to explain what this feature can achieve.
Table: Verifiable Builds Comparison
| Criterion | Whonix ™ | Tails | Tor Browser | Qubes OS TorVM | corridor |
|---|---|---|---|---|---|
| Deterministic builds | No | No (planned) | Yes | No | Not applicable |
| Based on a deterministically built operating system | No | No | Not applicable | No | No |
| Verifiably no backdoor in the project's own source code | Invalid | Invalid | Invalid | Invalid | Invalid |
| Verifiably vulnerability-free [archive] | No | No | No | No | No |
| Verifiably no hidden source code in upstream distribution / binaries | No | No | No | No | No |
| Project's binary builds are verifiably created from the project's own source code (no hidden source code in the project's own source code) | No (deprecated) | No | Yes | No | Not applicable |
Some readers might be curious why Whonix was previously verifiable, while Debian and other distributions are not. In short, this is because Whonix is uncomplicated by comparison. In simple terms, Whonix is a collection of configuration files and scripts, and the source code does not contain any compiled code and so on. In contrast, Debian is a full operating system, without which Whonix would not exist. 
This feature was first made available in Whonix 8; only users who download a new image can benefit from it. It is not possible to audit versions older than Whonix 8 with this script.
This is only an introduction to this topic; see Verifiable Builds for full details.
Verifiable Whonix ™ Debian Packages
This has been deprecated because it is difficult to implement before the experimental Debian reproducible builds toolchain is merged into the stable release. For full details on this topic, see Verifiable Whonix ™ Debian Packages.
Whonix ™ Updates
An optional updater has been available in Whonix ™ since version 6 of the platform.  When it comes to trust, there is a large difference between building Whonix ™ from source code and using the Default-Download-Version.
APT Repository and Binary Builds Trust
When Whonix ™ is built from source code using the build script, and the builder audits the source code to be non-malicious and reasonably bug-free, Whonix ™ developers are unable to access the system. On the other hand, if the Whonix ™ APT repository is enabled, developers holding a Whonix ™ repository signing key could release a malicious update to gain full access to the machine(s).
Even if the Whonix ™ APT repository is not used with the Default-Download version, it is still theoretically possible for Whonix ™ developers to sneak a backdoor into the binary builds available for download. Although that is an unpleasant threat, using the Whonix ™ APT repository poses a greater risk: a malicious Whonix ™ developer could sneak in a backdoor at any time.
It is also easier to sneak backdoors into binary builds, since they contain compiled code from binary packages that are downloaded from the Debian repository at build time.
APT Repository Default Settings
- Building from source code: Whonix ™ APT Repository is disabled by default. 
- Default binary download: Whonix ™ APT Repository is enabled by default.
- Qubes/Install: Whonix ™ APT Repository is enabled by default.
Most users will have the Whonix ™ APT repository enabled. This means that when updated Whonix ™ Debian packages are uploaded to the Whonix ™ APT repository, they will be automatically installed when the system is upgraded.  If this behavior is unwanted, it can be disabled. Refer to the previous section outlining security implications before proceeding.
Table: Build and APT Repository Security Comparison (*: poor security; ****: best security)

| Method                                                      | Security |
|-------------------------------------------------------------|----------|
| Binary download with Whonix ™ APT repository                | *        |
| Binary download without Whonix ™ APT repository             | **       |
| Built from source code, Whonix ™ APT repository enabled     | ***      |
| Built from source code, Whonix ™ APT repository disabled    | ****     |
- The Whonix ™ binary download using the Whonix ™ APT repository is the most convenient method, but also the least secure.
- It is somewhat safer to use the Whonix ™ binary download and then disable the Whonix ™ APT repository. However, the user must then manually download updated Whonix ™ deb packages upon release, and independently verify and install them.
- The greatest security comes from building Whonix ™ and updated packages from source code, particularly if the source code is verified before building Whonix ™.
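For the middle option, manually verifying downloaded packages against a published checksum list could look like the following sketch; the package name and checksum file are hypothetical stand-ins, not actual Whonix ™ artifacts:

```shell
# Hypothetical package; a stand-in file created only for the demo.
printf 'pretend deb contents\n' > whonix-setup_1.0_all.deb

# At release time, the project would publish a (digitally signed) checksum list:
sha512sum whonix-setup_1.0_all.deb > SHA512SUMS

# Before installing, the user re-verifies the downloaded file against that list.
# Output "whonix-setup_1.0_all.deb: OK" means the file is unmodified.
sha512sum -c SHA512SUMS
```

The checksum list itself must be verified with the signing key first; otherwise an attacker who replaces the package can replace the list too.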
What Digital Signatures Prove
See Verifying Software Signatures for details on what digital signatures prove.
In short, a user must be careful to ensure the public keys used for signature verification are the Whonix ™ key pair belonging to the Whonix ™ developer of the specific component. At the time of writing there are two different components and signing keys.
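As a hedged illustration of what signature verification involves (not the actual Whonix ™ keys, filenames, or procedure), the following self-contained sketch creates a throwaway keyring, signs a pretend release file, and shows verification failing after tampering. It assumes GnuPG 2.1 or later is installed; the identity and filenames are invented:

```shell
# Use a throwaway keyring so the demo does not touch the user's real keys.
export GNUPGHOME="$(mktemp -d)"
chmod 700 "$GNUPGHOME"

# Generate a demo signing key (hypothetical identity, no passphrase).
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key 'Demo Signer <demo@example.com>' default default never

# Sign a pretend release file with a detached, ASCII-armored signature.
echo 'pretend release image' > release.img
gpg --batch --pinentry-mode loopback --passphrase '' \
    --detach-sign --armor --output release.img.asc release.img

# Verification succeeds while the file is unmodified.
gpg --verify release.img.asc release.img

# Any tampering after signing makes verification fail.
echo 'malicious change' >> release.img
gpg --verify release.img.asc release.img || echo "signature check FAILED as expected"
```

Note what this does and does not prove: the signature only ties the file to whoever controls the key, which is why confirming that the key really belongs to the relevant developer is the critical step.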
Evil Developer Attack
An "evil developer attack" is a narrow example of an insider threat: 
Software development teams face a critical threat to the security of their systems: insiders.
An insider threat is a current or former employee, business partner, or contractor who has access to an organization’s data, network, source code, or other sensitive information who may intentionally misuse this information and negatively affect the availability, integrity, or confidentiality of the organization’s information system.
In the case of software, a disguised attack is conducted on the integrity of the software platform. While this threat is only theoretical, it would be naive to assume that no major software project has ever had a malicious insider. Whonix ™ and all other open source software projects face this problem, particularly those that are focused on anonymity such as VeraCrypt,  Tails, I2P, The Tor Project and so on.
A blueprint for a successful insider attack is as follows:
- Either start a new software project or join an existing software project.
- Gain trust by working hard, behaving well, and publishing your sources.
- Build binaries directly from your sources and offer them for download.
- Attract a lot of users by making a great product.
- Continue to develop the product.
- Make a second branch of your sources and add malware.
- Continue to publish your clean sources, but offer your malicious binaries for download.
- If undetected, a lot of users are now infected with malware.
An evil developer attack is very difficult for end users to notice. If the backdoor is rarely used, then it may remain a secret for a long time. If it was used for something obvious, such as adding all the users to a botnet, then it would be quickly discovered and reported on.
Open source software has some advantages over proprietary code, but certainly not for this threat model. For instance, hardly anyone checks whether the distributed binaries were actually made from the proclaimed source and publishes the results; making such verification possible is the goal of "deterministic builds".  This standard is quite difficult to achieve, but is being worked towards. 
While most security experts are focused on the possibility of a software backdoor, other insider attacks can have equally deleterious effects. For instance, the same methodology can be used to infiltrate a targeted project team but in a role unrelated to software development; for example, as a moderator, site administrator, wiki approver and so on. This approach is particularly effective in smaller projects that are starved of human resources.
Following infiltration, disruption is caused within the project to affect productivity, demoralize other team members and (hopefully) cause primary contributors to cease their involvement. For example, using a similar blueprint to that of the evil developer attack, a feasible scenario is outlined below:
- Join an existing software project as a general member.
- Gain trust by working hard, behaving well, assisting readily in forums, making significant wiki contributions and so on.
- Attract a lot of community admiration by outwardly appearing to be a bona fide and devoted project member.
- Eventually attain moderator, administrator or other access once team membership is extended. 
- Continue to behave, moderate and publish well.
- Once trust is firmly established, subtly undermine the authority, character and contributions of other team members. 
- If the insider threat is undetected for a significant period, this can lead to a diminished software product due to a fall in contributions in numerous domains and team ill will.
The insider threat nicely captures how difficult it is to trust developers or other project members, even if they are not anonymous. Further, even if they are known and have earned significant trust as a legitimate developer, this does not discount the possibility of serious mistakes that may jeopardize the user. The motives and internal security of everyone contributing to major software projects like Tor, distribution developers and contributors, and the hundreds of upstream developers and contributors is a legitimate concern. 
The trusted computing base of a modern operating system is enormous. So many people are involved in software and complex hardware development that it would be unsurprising if some of the bugs in existence were intentional. While detecting software changes in aggregate may be easy (by diffing the hash sums), finding and proving that a change in well designed source code is a purposeful backdoor rather than a bug is near impossible.
This is not legal advice; it does not discuss any law in particular. These are theoretical considerations by non-lawyers.
From time to time, users ask how the Whonix ™ project would react if some catastrophic draft law under contemporary discussion were to be passed, such as laws that ban end-to-end encryption or anonymity tools like Tor, or that demand operating systems include a backdoor.
It is important to note that government members have diverse and conflicting interests, and that bills hostile to internet privacy and security are introduced regularly. Often, however, common sense prevails and such ill-conceived legislation stalls and dies without ever being passed. Bills that allocate funding to support cryptographic development and privacy tools usually garner support and pass, because most legislators understand their importance in an open society. While it is important for internet privacy advocacy groups to remain ever vigilant, it is not productive to burn yourself out every time a braindead bill is discussed.
Assuming a privacy-hostile law were passed then, perhaps counter-intuitively, this is not a Whonix ™-specific issue, although Whonix ™ would be affected too. Whonix ™ is, for a large part, a compilation of existing software packages provided by third parties, whose permissive (Freedom Software) licensing allows re-use in such a compilation. In this context, noteworthy components which Whonix ™ relies on directly or indirectly are the base operating system (Debian at the time of writing) and an anonymizer (Tor at the time of writing). Such a law would very likely harm the security properties of these and other projects first. To learn more about this organizational structure, see Linux User Experience versus Commercial Operating Systems.
During such discussions it is usually suggested that, should such a law be passed, the legal entity could relocate to a different country. Whether that option exists depends on the specific text of the law; most likely there would be no such legal loophole. Examples where relocating a legal entity does not or did not help include people who would like to sell controlled substances (such as medicine) or goods (such as weapons) without all the authorizations required by law, as well as financial services. This is also why unnamed stock certificates on the blockchain do not exist.
Some U.S. laws apparently apply worldwide. Kim Dotcom, a German/Finnish dual national, permanent resident of and physically present in New Zealand at the time of the alleged copyright infringement charges brought forth by the USA, had his assets seized, worldwide bank accounts frozen, got arrested and is fighting extradition to the USA. As Kim Dotcom elaborately summarized on twitter [archive].
I never lived there
I never traveled there
I had no company there
But all I worked for now belongs to the U.S.
It is also suggested to simply not comply with the law. However, this is most likely too much to ask. Most laws include an enforcement mechanism, even if selectively applied depending on government interests. If the law is not complied with, especially for repeat and continuous offenses, there is a penalty, probably a prison sentence. Even if it is a monetary fine, the ultimate end result of continuous refusal to pay is seizure of assets, imprisonment, or worse.
Law enforcement has incredibly long arms. In most cases there is no way to openly defy the law for extended periods and getting away with it. To a large degree, policy issues cannot be fixed only through technology, but it must be combined with resistance on a political level. Resistance need not be forceful. Policies are affected by popular opinion, and if you support privacy-enhancing technologies you can help the cause by sharing your reasoned opinions with others. Casual supporters are important.
Maybe it would still be permitted to contribute Open Source code to Open Source projects. Maybe only the person redistributing binary builds to the public would be held personally accountable. All of this is speculation until a draft for such a new law catastrophic to security software solidifies.
What if Whonix ™ was forced to add a backdoor by law? Users would be notified and the project would be shut down before the law takes effect. Fortunately, there are no crazy law proposals yet that force one to continue running a backdoored project. Efforts might continue as a new project with a Linux distribution focusing on stability, reliability, documentation, recovery, and usability.
Other Projects Discussing Trust
- Tails is a live CD or USB that aims to preserve privacy and anonymity - Tails about trust. [archive] (w [archive])
- I2P (anonymizing network) has also discussed development attacks [archive]. (w [archive])
- Qubes OS: What do the Digital Signatures Prove and What They DO NOT Prove [archive] (w [archive])
- Miron’s Weblog: Attack Scenarios on Software Distributions [archive] (w [archive])
- A list of incidents concerning compromised servers: On distributing binaries [archive] (w [archive])
- Creator of the Linux kernel.
- https://www.govtechworks.com/open-source-is-safe-but-not-risk-free/ [archive]
- On the flip side, there is no guarantee that, just because software is open to review, sane reviews will actually be performed. Further, people developing and reviewing software must know the principles of secure coding.
- https://resources.infosecinstitute.com/topic/anti-disassembly-anti-debugging-and-anti-vm/ [archive]
- An Open Source application binary could be obfuscated in theory, but depending on the application and the context (obfuscation is not normal practice for Open Source software), that would be highly suspicious. An Open Source application using an obfuscator would probably be criticized in public, get scrutinized, and lose user trust.
This is because non-freedom software is usually only available as a pre-compiled, possibly obfuscated binary (using an anti-decompiler), which means:
- auditors can only look at the disassembly and cannot compare a pre-compiled version from the software vendor with a self-compiled version from source code.
- there is no well written, well commented, easily readable by design source code.
- Since there is no source code, one cannot self-build one's own binary.
- small: for non-reproducible builds (or reproducible builds with bugs)
- none: for reproducible builds
- License agreements of proprietary software often expressively forbid decompilation.
- Skype used the DMCA (Digital Millennium Copyright Act) to shut down reverse engineering of Skype [archive]
- For Freedom Software, decompilation is always legal and permitted by the license agreements.
- This is very difficult since nowadays most outgoing connections are encrypted by default. At some point the content must be available to the computer unencrypted, in plain text, but accessing it is not trivial. When running a suspected malicious application, one cannot trust local traffic analyzers such as Wireshark, since the malicious application might have compromised the host operating system and could hide that information from the traffic analyzer or exfiltrate it through a backdoor. An option might be running the application inside a virtual machine, but many malicious applications actively attempt to detect virtual machines and, if one is detected, avoid doing malicious things in order to evade detection. Ultimately this might be possible, but it is very difficult.
- One has to decompile the binary and read "gibberish", or try to catch malicious traffic originating from the software under review. How many people decompiled, for example, Microsoft Office, and kept doing that for every upgrade?
- Audit the source code to be free of backdoors.
- Compare the pre-compiled binary with a self-built binary, and audit the difference. Ideally, and in future, there would be no difference (thanks to the reproducible builds project), or only a small difference (due to non-determinism introduced during compilation, such as timestamps).
- "direct" backdoor: Such as a hardcoded username and password or login key only known by the software vendor. No plausible deniability for the software vendor.
- List of “direct” backdoors in wikipedia [archive].
One interesting “direct” backdoor was this bitcoin copay wallet backdoor.
If more than 100 BTC, steal it. Otherwise, don’t bother.
- https://www.synopsys.com/blogs/software-security/malicious-dependency-supply-chain/ [archive]
- https://github.com/dominictarr/event-stream/issues/116 [archive]
- https://github.com/dominictarr/event-stream/issues/116#issuecomment-441759047 [archive]
- Requires strong disassembly auditing skills.
- If, for example, hardcoded login credentials were in the published source code, that would be easy to spot. If the published source code differed from the actual source code used by the developer to compile the binary, that difference would stand out when comparing pre-compiled binaries from the software vendor with binaries self-compiled by an auditor.
- bugdoor: A vulnerability that can be abused to gain unauthorized access. Provides plausible deniability for the software vendor. See also Obfuscated C Code Contest [archive].
- Such issues are hard to spot in the source code but even harder to spot in the disassembly.
- Forbidden in license agreement. Due to lack of source code, no serious development is possible.
- Since source code is already available under a license that permits software forks and redistribution.
- This entry is to differentiate from the legal software fork above. Precompiled proprietary software is often modified by third parties, for example for purposes of privacy, game modding, or exploitation.
- For example, Intel ME cannot yet be disabled in Intel CPUs, nor is there a Freedom Software re-implementation of Intel microcode at the time of writing.
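As a toy illustration of the point above that hardcoded credentials in published source code are easy to spot, a simple pattern search is often enough; the file and credential here are invented for the example:

```shell
# A hypothetical vulnerable source file with a hardcoded credential.
cat > example.c <<'EOF'
const char *admin_password = "s3cret"; /* hardcoded backdoor credential */
EOF

# A simple pattern search over the published source flags such literals.
# Prints the matching line with its line number.
grep -nE '(pass(word)?|secret|token).*=.*"' example.c
```

Real audits use more sophisticated static analysis, but the contrast stands: the same credential buried in a stripped binary requires disassembly to find.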
- objdump [archive] with the -S parameter. See: How does objdump manage to display source code with the -S option? [archive]
- One could review the disassembly, but for subsequent releases that duplicates the effort. The disassembly is not optimized to change as little as possible or to be human-understandable. If the compiler added new optimizations, or compilation flags changed, that creates a much bigger diff [archive] of the disassembly.
- After the initial audit of a source-available binary, one can follow changes to the source code. To audit any newer release, an auditor can compare the source code of the initially audited version with the new version. Unless there was a huge code refactoring or a complete rewrite, the effort to audit subsequent versions is lower.
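The incremental re-audit described above amounts to diffing two source trees; a minimal sketch with invented release directories:

```shell
# Hypothetical source trees for two releases of the same project.
mkdir -p src-1.0 src-1.1
printf 'echo hello\n'             > src-1.0/run.sh
printf 'echo hello\necho world\n' > src-1.1/run.sh

# A recursive unified diff confines the re-audit to what actually changed.
# (diff exits 1 when differences are found; that is the expected case here.)
diff -ru src-1.0 src-1.1 || true
```

Only the added `echo world` line needs review, not the whole tree; this is exactly the property that disassembly lacks, since a compiler upgrade alone can churn the entire output.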
The low-level [archive] assembler programming language is more difficult than higher-level abstraction [archive] programming languages, according to most people discussing it on the internet. Example web search terms:
Source code written in higher-abstraction programming languages such as C and C++ is compiled to object code [archive] using a compiler. See this article [archive] for an introduction and this image [archive].
Source code written in the lower-abstraction assembler programming language is converted to object code using an assembler. See the same article and this image [archive].
Given a reasonably complex program written in C or C++ whose source code is unavailable, reverse engineering is very difficult. That can be deduced from its high price. It is possible to decompile (that is, re-convert) the object code back to C with a decompiler such as Boomerang [archive]. Quoting Boomerang: Help! I've lost my source code [archive], which puts a price tag on it:
How much will it cost? You should expect to pay a significant amount of money for source recovery. The process is a long and intensive one. Depending on individual circumstances, the quality, quantity and size of artifacts, you can expect to pay upwards of US$15,000 per man-month.
Next, consider the question of how to disassemble a binary (byte code) into assembly source code and re-assemble (convert) it back into a binary:
- Tricks to Reassemble Disassembly [archive]
- https://stackoverflow.com/questions/6327862/ida-pro-asm-instructions-change [archive]
- Why there are not any disassemblers that can generate re-assemblable asm code? [archive]
- https://reverseengineering.stackexchange.com/questions/3203/recompile-the-asm-file-ida-pro-created [archive]
- https://www.researchgate.net/publication/323249543_Superset_Disassembly_Statically_Rewriting_x86_Binaries_Without_Heuristics [archive]
- https://gist.github.com/jarun/ea47cc31f1b482d5586138472139d090 [archive]
- How to disassemble a binary executable in Linux to get the assembly code? [archive]
- Use GCC and objdump to disassemble any hex to assembly code [archive]
nasm -felf64 hello.asm
ld hello.o -o hello
4. objdump (optional).
objdump -d hello
5. Exercise for the reader: disassemble the GNU Hello [archive] program. The source file hello.c [archive] at the time of writing is comparatively short, while the disassembly produced on Debian buster is far longer:
objdump -d /usr/bin/hello
- See for example how difficult it was to reverse engineer Skype. Skype Reverse Engineering : The (long) journey ;).. [archive]
- Take all the Debian package maintainer scripts. Are these easier to review as-is, with most of them written in bash, or if they were converted to closed source, precompiled programs written in C?
- Do we prefer if OnionShare stays written in python, Open Source or do we prefer the project turned into a precompiled binary?
- salary comparison
- How much does a security audit cost reverse engineering vs source-available?
- https://lists.debian.org/debian-security-announce/2008/msg00152.html [archive] (w [archive])
- Debian also participates in security standardization efforts and related overarching projects.
- And undoubtedly advanced adversaries.
- That said, a skilled, malicious coder is far more likely to introduce subtle errors that open non-obvious attack vectors.
- This is a good starting point to understand how Whonix ™ works.
- https://forums.whonix.org/t/whonix-warrant-canary/3208/18 [archive]
- Meaning doubts should surface if a new canary was not issued for longer than 4 weeks.
If issues arise with the whonix.org server, this ensures the canary is always available online.
- This feature has been available since Whonix ™ 0.4.5
- Verifiable Builds allow auditors to check if there is hidden code inside Whonix ™.
- Using quotes since this is not well defined.
- Even then, capable adversaries have hacked their servers in the recent past; see here [archive].
- Whonix ™ developer (w [archive]), named "proper" in the past [archive] (w [archive]), renamed himself to adrelanos [archive] (w [archive]), published his OpenPGP key on 05/29/12 [archive] (w [archive]) (wiki history [archive] (w [archive])). Revealed his identity on 01/18/14. [archive] (w) [archive] Patrick Schleizer posted his OpenPGP key transition message on 01/18/14, signed by both his old and new key. [archive] (w) [archive]
- https://en.wikipedia.org/wiki/Open_Virtualization_Format [archive]
- This feature only adds security if people actually use it. Do not assume that someone else will do it for you.
- Due to build machine compromise.
- https://en.wikipedia.org/wiki/Backdoor_(computing) [archive]
- Whonix is based on Debian.
- Some Debian developers are steadily working on this long-term project, see: Reproducible Builds [archive].
- https://en.wikipedia.org/wiki/Vulnerability_(computing) [archive]
- https://en.wikipedia.org/wiki/Exploit_(computer_security) [archive]
- https://en.wikipedia.org/wiki/Phoning_home [archive]
Open Source software does not automatically prevent backdoors [archive], unless the user creates their own binaries directly from the source code. People who compile, upload and distribute binaries (including the webhost) could add hidden code, without publishing the backdoor. Anybody can claim that a certain binary was built cleanly from source code, when it was in fact built using the source code with a hidden component. Those deciding to infect the build machine with a backdoor are in a privileged position; the distributor is unlikely to become aware of the subterfuge.
Deterministic builds can help to detect backdoors, since the process reproduces identical binary packages (byte-for-byte) from a given source. For more information on deterministic builds and why this is important, see:
- liberationtech mailing list: Deterministic builds and software trust [archive].
- gitian.org [archive]
- As Mike Perry has observed: Deterministic Builds Part One: Cyberwar and Global Compromise [archive]. See:
- The Debian wiki tracking progress / development efforts to implement Reproducible Builds for all packages [archive].
- See Tails Roadmap [archive].
- See Deterministic Builds Part One: Cyberwar and Global Compromise [archive] and Deterministic Builds Part Two: Technical Details [archive].
- corridor only uses shell scripts.
- To be fair, there are no deterministically built operating systems yet. It is a difficult process and takes a lot of effort to complete. While Debian has around 22,000 reproducible packages [archive] in mid-2018, this work has been ongoing since 2013 and is far from done.
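The kind of normalization that reproducible builds require can be illustrated with GNU tar: pinning file order, timestamps, and ownership removes the usual sources of non-determinism, so two independent runs agree byte-for-byte. The paths are invented for the demo, and GNU tar (for --sort and --mtime) is assumed:

```shell
# A stand-in "build output" directory.
mkdir -p demo && printf 'hello\n' > demo/file

# Pin file ordering, timestamps, and ownership so the archive depends
# only on content, not on when or by whom it was created.
tar --sort=name --mtime='2021-01-01 00:00:00 UTC' \
    --owner=0 --group=0 --numeric-owner -cf build1.tar demo
tar --sort=name --mtime='2021-01-01 00:00:00 UTC' \
    --owner=0 --group=0 --numeric-owner -cf build2.tar demo

# Identical checksums despite being produced by two separate runs.
sha256sum build1.tar build2.tar
cmp -s build1.tar build2.tar && echo "reproducible: archives are identical"
```

Achieving the same property for an entire operating system means eliminating every such source of non-determinism across thousands of packages, which is why the effort has taken years.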
The first form of backdoor [archive] is a vulnerability [archive] (bug) in the source code. Vulnerabilities are introduced either purposefully or accidentally due to human error. Following software deployment, an attacker may discover the vulnerability and use an exploit [archive] to gain unauthorized access. Such vulnerabilities can be cleverly planted in plain sight [archive] in open source code, while being very difficult to spot by code auditors. Examples of this type of backdoor include:
- An attempt to backdoor the kernel [archive].
- The Debian SSL debacle [archive]; many argued that this wasn't a bug but in fact a backdoor, as it hadn't been spotted for several years.
It is therefore impossible to claim that non-trivial source code is backdoor-free, because backdoors can be hidden as vulnerabilities. Auditors scrutinizing the source code can only state an opinion about its quality, and report vulnerabilities if and when they are identified. An assertion that source code is free of computer viruses (like trojan horses) is the only reasonable assertion that can be made.
- Although theoretically possible, there are no mathematically proven bug-free [archive] operating systems yet.
- The upstream distribution is the distribution on which the project is based. Whonix and Tails are based on Debian, thus Debian is their upstream distribution. QubesOS TorVM is based on Qubes OS, which is itself based on Fedora and Xen.
- See verifiable builds.
- Whonix relies on the tireless efforts of Debian and other upstream projects.
- Because in order to implement the verifiable builds feature, a lot of non-deterministic, auto-generated files are removed at the end of the build process and re-created during first boot.
- It is not actually impossible, but it would require significant effort.
- Old advice: Since Whonix 7.5.2, all Whonix Debian packages have been deterministically built. This means that if the Whonix Debian packages 7.5.2 are built from source code, and 7.5.2 is downloaded from the Whonix Debian repository, it is possible to diff the checksums (for example the sha512sum) of those files, and they should match. This has been deprecated because of a dpkg bug: the estimate of the Installed-Size could be wrong by a factor of 8, or a difference of 100 MB [archive] (note: this bug has now been resolved). Different underlying file systems cause different file sizes, leading to checksums not matching.
- When Whonix ™ APT repository is disabled, there is no updater - as was the case in Whonix ™ 0.5.6 and below.
- At the moment, Whonix ™ developer Patrick Schleizer is the only one holding the Whonix ™ APT repository OpenPGP signing key.
- See the Verifiable Builds section for further details.
- Since Whonix ™ version 7.3.3
- To disable this setting, see qubes-template-whonix [archive]: in builder.conf [archive], set DERIVATIVE_APT_REPOSITORY_OPTS = off
- After running sudo apt-get update && sudo apt-get dist-upgrade manually or via a GUI updater.
- Whonix ™ developers place little trust in the CA model. Even if the numerous implementation problems were solved, such as problematic revocation and the ability for every CA to issue certificates for anything (including "*"), third party trust cannot be established. Until an alternative arrives and is widely adopted, everybody has to rely upon SSL/TLS to some extent.
- https://www.se.rit.edu/~samvse/publications/An_Insider_Threat_Activity_in_a_Software_Security_Course.pdf [archive]
- TrueCrypt has been discontinued.
- https://mailman.stanford.edu/pipermail/liberationtech/2013-June/009257.html [archive]
- https://trac.torproject.org/projects/tor/ticket/3688 [archive]
- Interested readers can investigate its complexity by searching with the phrase "trusting trust".
- The time period is likely to be shorter for smaller projects, perhaps less than 12 months.
- For example, by casting unjustified aspersions.
- In the case of Whonix ™, no binaries are created from project source code; only unmodified upstream binaries are distributed, along with shell scripts. This claim is much easier to verify than if Whonix ™ were distributing binaries built from project source code.
Whonix ™ Trust wiki page Copyright (C) Amnesia <amnesia at boum dot org>
Whonix ™ Trust wiki page Copyright (C) 2012 - 2021 ENCRYPTED SUPPORT LP <email@example.com>
This program comes with ABSOLUTELY NO WARRANTY; for details see the wiki source code.
This is free software, and you are welcome to redistribute it under certain conditions; see the wiki source code for details.
This is a wiki. Want to improve this page? Help is welcome and volunteer contributions are happily considered! Read, understand and agree to Conditions for Contributions to Whonix ™, then Edit! Edits are held for moderation. Policy of Whonix Website and Whonix Chat and Policy On Nonfreedom Software applies.
Copyright (C) 2012 - 2021 ENCRYPTED SUPPORT LP. Whonix ™ is a trademark. Whonix ™ is a licensee [archive] of the Open Invention Network [archive]. Unless otherwise noted, the content of this page is copyrighted and licensed under the same Freedom Software license as Whonix ™ itself. (Why?)
The personal opinions of moderators or contributors to the Whonix ™ project do not represent the project as a whole.