This essay offers a practical tip on transparency as part of establishing good governance in private companies, non-profit organizations, and public institutions. It is the second part of a nine-part series to support organizations in modernizing their information systems.
A computer is a wonderful thing. It can streamline procedures, making tasks less arduous to perform. A computer is also an object to be feared. Its ability to store information en masse means that facts, both the good and the less flattering, can expose dealings that leaders and organizations may wish not to divulge to the public. People may grow afraid of the computer for how much it could reveal about their lives.
Rapid advances in computing bring the world to an inflection point in deciding how much information should be released to the general public. Should every piece of data about the minutiae of business operations in corporate and government life be made available for everyone to see? Should an organization be absolutely transparent in its operations?
The benefits of open organizations are obvious. Leaders can be held accountable when projects fail and unlawful acts arise. The public becomes aware of particular situations and learns how business runs. Democracy is strengthened as citizens become informed and engaged.
On the other side of the debate, what are the costs to being open and forthright? Can an organization be harmed by transparency? Would people have more to lose than what they could gain from a full and open society?
Along with the rise in the use of the World Wide Web (WWW) since the 2000s, governments around the world have introduced e-government[1] programs to facilitate public administration and provide government services online. Underlying the theme of e-government is the movement to make government more responsive to the needs of citizens. Implementing web applications and other modern information and communication technologies has made government more open, allowing people to interact efficiently with government agencies.
The drive toward open institutions, however, has created new problems, not just for public organizations but for private companies as well. The ease with which data can be transferred between multiple organizations poses the risk of sending information into the wrong hands. Websites connect to various third parties to receive text and images and to share preferences. The interconnectedness of the WWW enables data to flow to whomever and wherever there's an established link to receive information. The WWW is made up of numerous such links, connecting people and things around the world in a global computer network.
Transparency is a paradox, as David Pozen has illuminated, in that it both solves and creates problems. Pozen explained how open government has undermined democratic principles and argued that transparency is not a virtue to uphold but rather a means to an end.[2]
To understand transparency, one needs to look at what would be lost relative to what would be gained when information is disclosed or withheld. Pozen examined the tradeoffs of privacy and articulated that protecting privacy along one dimension may compromise it along another.[3] In effect, a person or an organization may choose to disclose certain information to protect one aspect of privacy, but the very same disclosure may leave the person or organization vulnerable to another form of privacy risk. There's no universal definition of what privacy means. A privacy interest can form in a variety of situations. A homeowner may install video cameras in all the rooms of his house to protect personal belongings against theft. The homeowner, in this instance, gives up his interest in keeping all of his possessions private. The installed cameras, while serving to protect the homeowner's property, may be used to snoop on the private activities of the homeowner and his guests.
A corporation may fully disclose the methods behind a novel product and receive a patent for its invention. In exchange for giving up the secret of how its product delivers a better service to customers, the corporation can dominate the market and charge a high price for the innovative product. The corporation gives up its interest in maintaining secrecy in its product design. The disclosure may leave the corporation vulnerable to a number of privacy concerns: the invention can be reverse engineered by others to exploit technical weaknesses, identify potential suppliers, or find other things that the corporation may not want publicly known.
Multiple government agencies may agree to share client records related to vaccinations, voter registration, and unemployment compensation. These agencies use a secured web application to transfer information about individuals who participate in various programs. While a computer system is in place to protect the data from unauthorized access, the ability to share disparate data sets opens up the possibility for personnel to combine the data sets to create comprehensive profiles on thousands of residents. The combined single database poses new risks that may expose vulnerable populations to increased scrutiny and potential abuse.
The foregoing examples illustrate privacy tradeoffs across different organizational contexts. Compromising one aspect of privacy for the benefit of another is not only a concern in the public sector but can be an issue in private business and personal life. A person needs to think through all possible cases in which disclosure might impact other areas.
Legal scholars have developed frameworks to mitigate and balance privacy risks. One framework organizes data into four categories. By reviewing data as belonging to raw personal data, pseudonymized data, anonymized data, or non-personal data,[4] an analyst can determine the extent to which a particular group of information can be made available for external use. The four categories may be arranged along a continuum where personally identifiable information (PII) makes up raw personal data at one end of the scale and non-personal data (for example, weather data or scheduled public transportation routes) sit at the other end. Non-personal data generally do not contain PII. Pseudonymized data and anonymized data fall somewhere in the middle, with the former closer to the raw side and the latter approaching the non-personal side. Information that's pseudonymized may not readily point to a particular person, but with some level of effort, pseudonymized data could be deciphered to reidentify an individual.[5] On the other hand, techniques to anonymize a person or a group of people make it more difficult to reidentify specific individuals. Along the continuum, information that's non-personal or anonymized poses little risk to privacy in comparison to raw data and pseudonymized data. While raw personal data should not be a part of an open data initiative, non-personal data may be considered for public release.
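To make the pseudonymization step concrete, here is a minimal sketch, not a production method: it replaces a driver's license number with a short keyed-hash code, so records can still be linked while the mapping is reversible only by whoever holds the secret key. The key, field names, and values below are hypothetical.

```python
import hmac
import hashlib

# Secret key held only by the data owner and never shared with third
# parties (hypothetical value for illustration).
SECRET_KEY = b"owner-only-secret"

def pseudonymize(driver_license: str) -> str:
    """Replace a direct identifier with a six-character alphanumeric code.

    The same license number always maps to the same code, so records can
    still be linked across data sets, but reversing the mapping requires
    the secret key.
    """
    digest = hmac.new(SECRET_KEY, driver_license.encode(), hashlib.sha256)
    return digest.hexdigest()[:6].upper()

# A raw record (kept internal) and its pseudonymized release version.
record = {"license": "D1234567", "vaccinated": True}
released = {"subject_id": pseudonymize(record["license"]),
            "vaccinated": record["vaccinated"]}
```

Note that pseudonymization of this kind preserves linkability, which is exactly why such data sits closer to the raw end of the continuum: anyone who obtains the key, or who can correlate the codes with outside information, may reidentify individuals.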
Another framework draws on information security practices to analyze the risks of data releases according to privacy controls, privacy threats, privacy harms, privacy vulnerabilities, and data utility.[6] Altman and his team applied these dimensions across the phases of information management from data collection to data processing to the releases of results and data sets, highlighting technical and managerial approaches that preserve privacy interests and permit certain levels of access.
Two computing methods mentioned in Altman's study are worth elaborating here. Both provide mechanisms to avoid transmitting raw personal data. With the first, instead of accessing raw data directly, external users use a specialized web application to select certain predefined parameters and submit a query. The application controls what types of data and how much information can be displayed, so the user receives results only in a limited and constrained format. The second method (an emerging technology) is homomorphic encryption,[7] where data can remain encrypted in storage and yet be searched and analyzed. An organization wouldn't have to decrypt and release a data set in plain text, but rather can provide access through a specialized software program that allows an authorized user to process encrypted data without actually seeing it in raw form.
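The first mechanism, a constrained query interface, can be sketched in a few lines. This is a toy illustration under assumed data and parameter names: the application accepts only predefined parameters and returns aggregate counts, suppressing small results so that raw records never leave the system.

```python
# Hypothetical internal data set of raw records (never exposed directly).
_RECORDS = [
    {"county": "Adams", "program": "vaccination"},
    {"county": "Adams", "program": "vaccination"},
    {"county": "Adams", "program": "unemployment"},
    {"county": "Baker", "program": "vaccination"},
]

ALLOWED_PARAMS = {"county", "program"}  # the only predefined parameters
MIN_CELL_SIZE = 2  # suppress counts small enough to single out a person

def query(**filters):
    """Return only an aggregate count for predefined parameters.

    Unknown parameters are rejected, and counts below the suppression
    threshold are reported as None so small groups can't be identified.
    """
    if not set(filters) <= ALLOWED_PARAMS:
        raise ValueError("parameter not permitted by this interface")
    n = sum(all(r.get(k) == v for k, v in filters.items())
            for r in _RECORDS)
    return n if n >= MIN_CELL_SIZE else None
```

A user asking for the count of vaccination records in Adams County would receive a number; a query that matches only one resident would be suppressed, and a query on a field outside the predefined set would be refused outright.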
As advancements in information technology (IT) are rapidly developed and brought to market, there's a new concern regarding how government acquires the latest technology.[8] Machine learning and artificial intelligence (AI) may be delivered in such a form that public managers don't understand how the systems operate to process and output information. In the past, public managers were involved in the design phase to ensure contractors had developed an IT system according to a government program's specific business rules and functional requirements. Public managers understood the logic of information management systems because of the explicit instructions programmed into those systems. Today's machine learning systems may rely on implicit rules and assumptions (gained from training data) that enable algorithms and computational methods to generate inferences and conclusions based not on human logic but on machine logic [emphasis mine].[9] The way a machine thinks is not the same as how a human thinks. This is the critical difference at play in the development of AI. A machine learning system used in a government program may be so opaque that decisions and the administrative process elude accountability, precisely because machine logic functions on a separate plane.
To balance secrecy and disclosure, whether for a black box IT system, national security classifications, or information more broadly, organizations can establish a panel of subject matter experts to oversee the administrative process and provide technical guidance. The Fundamental Classification Policy Review (FCPR), initiated by the Secretary of Energy in 1995, brought together technical experts across the Department of Energy to review the department's entire classification system. A guiding principle in the FCPR's effort was not to find a benefit in releasing information but to answer, based on reasoned judgment, why certain information must be protected.[10] What's the inherent risk? Experts reviewed and updated their criteria to be in line with current needs and risk levels in maintaining classified documents.
The Interagency Security Classification Appeals Panel (ISCAP) is an example of a multiagency forum composed of senior level representatives from the Departments of Defense, Justice, and State, the National Archives and Records Administration, the Central Intelligence Agency, the Office of the Director of National Intelligence, and the National Security Council. ISCAP has authority to review public appeals to declassify and release previously classified information, permitting group members to deliberate and come to a consensus on what should and should not remain secret. Aftergood showed how the group-led effort provides an effective check on agency discretion with the ISCAP voting to declassify 495 of 769 classified documents (64% of the time) from 1996 to 2008.[11] Left on its own, an agency may well decide to continue to retain a classified document for several decades. Brought together in a forum, other agencies would weigh in on the matter and discuss the merits of why a particular document needs to be withheld from public inspection.
During the Obama administration, the U.S. Digital Service (USDS) and 18F were established to facilitate government-wide efforts to design and implement web applications and IT infrastructures. These two groups assembled technical teams to provide guidance and support to federal agencies. In a review of similar teams created at the state and local levels, Mulligan and Bamberger explained how USDS and 18F had exemplified a new model for leading and coordinating cross-agency efforts to manage information technology programs.[12]
It needs to be noted that, under the second Trump administration, 18F was shuttered[13] and USDS was reorganized as the "U.S. Department of Government Efficiency (DOGE) Service" with an expanded scope that reaches into human resources management, contracts and grants management, federal funding, rulemaking, voter registration, and nuclear technology.[14] The new DOGE Service has been given authority by President Trump to provide technical guidance beyond the field of computer science. A future review of USDS operating as DOGE should be conducted to determine, among other things, whether the group continues to represent a model of cross-agency IT coordination.
The impetus behind the establishment of FCPR, ISCAP, 18F, and the original USDS was the need to understand the complexities of changing situations and to grapple with evolving circumstances. The single agency, multiagency, and government-wide groups demonstrate the value of soliciting expertise from others to deal with a particular environment that’s deeply technical.
Determining what should be disclosed and what should remain secret is not an easy task. A lone person or a single organization shouldn't decide too hastily to release information without examining the impacts of the release. A team of competent individuals is better positioned to evaluate the risks, pointing out downsides that one person may not have considered or may not be aware of. Consulting partner organizations affords them an opportunity to give feedback on how a data release may impact their operations. A partner may have objections to the proposed disclosure.
A decision to make information publicly available is not a trivial matter without consequences. Once released, the now public information may be difficult to pull back, control, and hold again. Disclosing previously held information can be like releasing the proverbial genie from the bottle. Leaders should be cognizant of how information can spread, especially when it's transmitted across the World Wide Web.
Evidence for Practice
Preserving privacy varies by context and can change from one situation to another.
Releasing or withholding information needs to be reasonably justified.
Consulting relevant experts on deciding what to disclose allows for a discussion that will result in a reasoned judgment to keep information confidential or make it public.
Next Steps for Leaders
Leaders should regularly review their criteria on classifying information and determine if requirements and needs have changed. Factors involving changes to laws and regulations, changes to internal operations, and changes to agreements with partner organizations may require updates to classification criteria and operating procedures. Sunshine laws, disclosure requirements, data sharing, and privacy protection are particular areas to focus on. The terms classification, classifying, and classified documents are meant to be interpreted broadly to apply to any organization (public and private) that has a need to retain confidential information for internal use only.
An internal committee should be established to review and approve changes, provide technical guidance, and oversee the administrative process on matters related to information protection, retention, and transmission. The committee should include representatives from external partners if those partners have agreements to share and use information.
Government agencies (federal, state, and local) should review their procurement procedures and make any necessary adjustments to adequately evaluate and test AI systems before they can be used in operations. Contractors may be unwilling or unable to disclose their algorithms, computational methods, and other system design elements. If public managers aren’t able to provide instruction in the course of designing an AI system, they can and should provide direction on what outputs the system must produce. Procurement focus would change to ensuring system outputs match the expectations of a government program.
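The output-focused approach to procurement suggested above can be expressed as a simple acceptance test. The following is a hedged sketch, with a hypothetical vendor function and eligibility rule standing in for a real black-box AI system: the agency specifies input/output expectations and checks conformance without ever seeing the vendor's internal logic.

```python
def acceptance_test(system, expected_cases):
    """Evaluate a black-box system purely by its outputs.

    Takes a callable and a list of (inputs, expected_output) pairs
    drawn from a government program's requirements, and returns the
    cases where the system's output diverges from expectations.
    """
    failures = []
    for inputs, expected in expected_cases:
        actual = system(inputs)
        if actual != expected:
            failures.append((inputs, expected, actual))
    return failures

# Stand-in for a vendor's opaque system (hypothetical rule; in practice
# the internals would be a machine learning model the agency can't read).
def vendor_system(applicant):
    return "eligible" if applicant["income"] < 30000 else "ineligible"

# Expected outputs specified by the program office (hypothetical cases).
cases = [
    ({"income": 20000}, "eligible"),
    ({"income": 50000}, "ineligible"),
]
failures = acceptance_test(vendor_system, cases)
```

An empty failure list would indicate the system's outputs match the program's expectations for the specified cases; a non-empty list gives public managers concrete, reviewable evidence of where the opaque system diverges from program rules.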
Open for Discussion
Is your organization completely transparent in how it operates? How often do you discuss the risks of disclosing certain types of data? Have you ever had an instance where you regretted the release of information? Feel free to submit your answers in the comment section. If you're not comfortable sharing your experiences, you can talk to a specialist at Peaceful Governance Institute (PGI) in private through the PGI support hotline.
Notes
1. E-government is shorthand for electronic government.
2. David E. Pozen, (2020) “Seeing Transparency More Clearly,” Public Administration Review 80(2): 326–327.
3. David E. Pozen, (2016) “Privacy-Privacy Tradeoffs,” The University of Chicago Law Review 83(1): 222.
4. Frederik Zuiderveen Borgesius, Jonathan Gray, and Mireille van Eechoud, (2015) “Open Data, Privacy, and Fair Information Principles: Towards a Balancing Framework,” Berkeley Technology Law Journal 30(3): 2077, 2114–2122.
5. Pseudonymized data typically replaces PII with another kind of unique identifier. Instead of using a person’s driver’s license number, for example, a six-character alphanumeric code may be used to identify the person without referencing his driver’s license number. The method of creating the code is only known to the owner of the data and not shared with any third party.
6. Micah Altman, Alexandra Wood, David R. O’Brien, et al., (2015) “Towards a Modern Approach to Privacy-Aware Government Data Releases,” Berkeley Technology Law Journal 30(3): 2011–2013.
7. For more information, see the following: Martin Albrecht, Melissa Chase, Hao Chen, et al., (2018) “Homomorphic Encryption Standard,” 21 November, (Toronto: HomomorphicEncryption.org). (Accessed 31 March 2026 at https://homomorphicencryption.org/standard/.) and Craig Gentry, (2009) “A Fully Homomorphic Encryption Scheme,” PhD Dissertation, September, (Palo Alto, CA: Stanford University). (Accessed 31 March 2026 at https://crypto.stanford.edu/craig/craig-thesis.pdf.)
8. Deirdre K. Mulligan, and Kenneth A. Bamberger, (2019) “Procurement as Policy: Administrative Process for Machine Learning,” Berkeley Technology Law Journal 34(3): 773–852.
9. Mulligan and Bamberger: 814–817.
10. Steven Aftergood, (2009) “Reducing Government Secrecy: Finding What Works,” Yale Law & Policy Review 27(2): 409–410.
11. Aftergood: 407.
12. Mulligan and Bamberger: 830–833.
13. Jason Miller, (2025) “After Rocky History, GSA Shuts Down 18F Office,” Federal News Network, 1 March. (Accessed 26 April 2026 at https://federalnewsnetwork.com/reorganization/2025/03/after-rocky-history-gsa-shuts-down-18f-office/.) and Karoun Demirjian, and Madeleine Ngo, (2025) “Dozens of Government Technology Specialists Fired,” New York Times, 3 March. (Accessed 26 April 2026 at https://www.nytimes.com/2025/03/03/us/politics/18f-technology-specialists-fired.html.)
14. Executive Office of the President, (2025) “Establishing and Implementing the President’s ‘Department of Government Efficiency’,” Executive Order 14158, 20 January, Federal Register 90(18): 8441–8442. (Accessed 26 April 2026 at https://www.federalregister.gov/d/2025-02005.)
Executive Office of the President, (2025) “Reforming the Federal Hiring Process and Restoring Merit to Government Service,” Executive Order 14170, 20 January, Federal Register 90(19): 8621–8623. (Accessed 26 April 2026 at https://www.federalregister.gov/d/2025-02094.)
Executive Office of the President, (2025) “Implementing the President’s ‘Department of Government Efficiency’ Workforce Optimization Initiative,” Executive Order 14210, 11 February, Federal Register 90(30): 9669–9671. (Accessed 26 April 2026 at https://www.federalregister.gov/d/2025-02762.)
Executive Office of the President, (2025) “Ending Taxpayer Subsidization of Open Borders,” Executive Order 14218, 19 February, Federal Register 90(36): 10581–10582. (Accessed 26 April 2026 at https://www.federalregister.gov/d/2025-03137.)
Executive Office of the President, (2025) “Ensuring Lawful Governance and Implementing the President’s ‘Department of Government Efficiency’ Deregulatory Initiative,” Executive Order 14219, 19 February, Federal Register 90(36): 10583–10585. (Accessed 26 April 2026 at https://www.federalregister.gov/d/2025-03138.)
Executive Office of the President, (2025) “Implementing the President’s ‘Department of Government Efficiency’ Cost Efficiency Initiative,” Executive Order 14222, 26 February, Federal Register 90(40): 11095–11097. (Accessed 26 April 2026 at https://www.federalregister.gov/d/2025-03527.)
Executive Office of the President, (2025) “Preserving and Protecting the Integrity of American Elections,” Executive Order 14248, 25 March, Federal Register 90(59): 14005–14010. (Accessed 26 April 2026 at https://www.federalregister.gov/d/2025-05523.)
Executive Office of the President, (2025) “Zero-Based Regulatory Budgeting To Unleash American Energy,” Executive Order 14270, 9 April, Federal Register 90(71): 15643–15646. (Accessed 26 April 2026 at https://www.federalregister.gov/d/2025-06466.)
Executive Office of the President, (2025) “Ordering the Reform of the Nuclear Regulatory Commission,” Executive Order 14300, 23 May, Federal Register 90(102): 22587–22590. (Accessed 26 April 2026 at https://www.federalregister.gov/d/2025-09798.)
Executive Office of the President, (2025) “Reforming Nuclear Reactor Testing at the Department of Energy,” Executive Order 14301, 23 May, Federal Register 90(102): 22591–22593. (Accessed 26 April 2026 at https://www.federalregister.gov/d/2025-09799.)
