Managing Information for Accurate and Timely Decisions
Should AI be used or is there another method?
This essay offers a practical tip on information management, one element of good governance. It is part of a monthly series on strengthening governance in private companies, non-profit organizations, and public institutions.
“Acquiring and using information is a cornerstone of economic activity. In order to channel resources to their most productive ends, needs and capabilities must be identified.”[1]
Any organization, whether it’s a small business, a community center, or a public institution, needs information (more precisely, data) to make accurate decisions. Without information, an organization may make a mistake or fail to reach its goals. How will the manager know how much to produce? How many customers should be served? How many dollars need to be spent and received? Answering these questions necessitates gathering and analyzing information. Collecting information is just as important a task in business as managing financial transactions and hiring employees.
As with anything else in business, information has a cost. Purchasing a mailing list has a certain price. Tools to manage and analyze data come at various prices. Cloud computing or a computer server costs money to store and process data. Even without the modern conveniences provided by the Internet, managing records on paper by hand with pencil and calculator carries a cost. Time itself adds to expenditures in the effort necessary to collect, review, and catalog information.
Given information’s relative importance in business and in decision-making, how can an organization control its expenses in obtaining and processing information? Are there practical strategies to minimize information collection without sacrificing quality? Does an organization need to acquire all available data?
A casual observer might say information management is cheap, even free. One can download and use all sorts of software apps for little to no cost. Another can subscribe to free web services. None of these options is actually free. A service may appear free to the user, but the costs are in fact borne by the software developer or service provider. Freeware and shareware applications, and the risks they bring, are a whole separate topic for another essay. For now, I will offer the common refrain, “Buyer beware,” which is even more relevant today with modern information and communication technologies.
In the drive to develop artificial intelligence (AI) to a level that matches human thinking, tech companies have built their data models by grabbing whatever information is available on the World Wide Web, regardless of whether that information is a fact, an opinion, or a lie. Data are gathered and stored indiscriminately, like a vacuum cleaner sucking up dust and debris. OpenAI, the developer of ChatGPT, asserts that the performance of language models improves as data are added by the billions. Software engineers tested OpenAI’s assertion with a data model comprising 1 billion parameters trained over a collection of 20.3 million documents containing 16.2 billion words.[2] It’s amazing how far the technology has evolved. The latest feats in computing, nevertheless, don’t resolve the fundamental question of whether the generated output is true or relevant.
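The scaling claim tested in the cited paper has a simple mathematical shape: model loss falls as a power law in the parameter count. A minimal sketch of that relationship, using the approximate fitted constants reported by Kaplan and colleagues (the exact numbers depend on the dataset and training setup, so treat them as illustrative):

```python
# Illustrative sketch of the parameter-count scaling law from Kaplan et al.
# (2020): L(N) ~ (N_c / N) ** alpha_N, where L is cross-entropy loss and N is
# the number of non-embedding parameters. Constants are the paper's
# approximate fitted values, not universal truths.

ALPHA_N = 0.076   # fitted exponent reported in the paper
N_C = 8.8e13      # fitted scale constant (non-embedding parameters)

def predicted_loss(n_params: float) -> float:
    """Predicted cross-entropy loss for a model with n_params parameters."""
    return (N_C / n_params) ** ALPHA_N

# Loss shrinks slowly: a 1,000x jump in parameters trims loss only modestly.
small = predicted_loss(1e9)   # ~1 billion parameters, as in the tested model
large = predicted_loss(1e12)  # ~1 trillion parameters
print(f"1B params: {small:.2f}, 1T params: {large:.2f}")
```

The shallow exponent is the point: improvement comes only from enormous additional data and parameters, which helps explain the industry’s indiscriminate appetite for web data.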
In my own research, I explored the history of data analysis and was struck by the insight of John Tukey, a renowned statistician. Tukey reminds us that statistics is one facet of data analysis and that statistics should be used as a basis to form a judgment, not to validate a proposition.[3] Data analysis is a scientific discipline that requires a rigorous inquiry into the problem at hand.
Users of information technology (IT), and of AI especially, need to be wary about the software they’re using to make forecasts and decisions. A person can do immediate checks by corroborating computer results with other sources. Utter reliance on AI output could lead to pursuing a wrong path, purchasing a faulty product, harming one’s personal well-being, or perhaps even war. One study created a provocative fictional story in which the U.S. and China relied on a stream of outputs from their AI systems, and the results led both countries to launch nuclear missiles against each other.[4] The authors of the study showed how the generated outputs could not present the real reason behind each state’s actions. What was missing was the context of what was really happening on the ground. Leaders in the war game scenario made a grave decision based on a false narrative that failed to bring all the data points together into a coherent whole.
Despite its technical limitations, AI will be used for better or worse. I remain optimistic that AI will reach maturity. Chatbots, in their present form, communicate like adolescents. They still have much to learn, indeed.
In terms of business application, there’s a better alternative. Advanced IT systems with data analytical tools[5] exist that enable companies to process their production data and create efficiencies in how they manufacture their products. What is more important is choosing the right kind of information to enter into the IT system. Companies themselves hold a wealth of internal information in their offices and production facilities. The emphasis here is on using internal, proprietary data as the input to IT systems. Organizations don’t need to rely on an open system in which much of the external information is questionable. A private system containing the organization’s proprietary data will ensure that managers receive valid results.
While adoption of information technology rose dramatically from 2005 to 2010, Brynjolfsson and McElheran found that the adoption rate was uneven across the manufacturing sector, with 70% of manufacturing plants choosing not to adopt.[6] The results indicate that non-adopters tend to be companies where leaders play a more hands-on role in production activities and where managers’ tenure is longer.[7] The study suggests that the decision to use advanced IT systems depends largely on the size of the company: large producers are likely to adopt while small producers are not. This difference would correlate with a firm’s ability to invest in information technology. Small companies may have neither the cash to purchase computer hardware and software nor the expertise to use such equipment.
Organizations can find other avenues and develop methods to analyze their data without being heavily reliant on information technology. To start: leaders can utilize the talents of their employees who have expert knowledge or years of experience. Organizations have already invested in natural intelligence by employing people. Workers have the brain power to understand the operating environment, troubleshoot whenever something goes wrong, and make necessary corrections. Workers are available resources to process internal information.
Crawford and Sobel wrote a seminal paper on how information can be collected and transmitted between two parties, a principal and an agent. Numerous later studies have explored and extended Crawford and Sobel’s concept of strategic communication. The core question is deciding how much information to send to a decision-maker. Crawford and Sobel’s contribution was the finding that communication can proceed when the goals of both parties are closely related.[8] In an organization, the manager’s and worker’s interests would be aligned. An external partner who contracts with an organization would share goals with the organization. In both cases, there is an incentive to communicate and transfer information. Shared or similar goals among internal and external parties form the bedrock for knowledge sharing and information transfer.
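In Crawford and Sobel’s leading example (a state drawn uniformly from [0,1] with quadratic preferences), this finding takes a sharp numeric form: the number of distinct messages the agent can credibly send shrinks as the bias b between the two parties’ preferred actions grows. A minimal sketch; the closed-form expression is the standard uniform-quadratic result, and the sample bias values are illustrative:

```python
import math

def max_credible_messages(bias: float) -> int:
    """Largest number of distinct, credible messages in Crawford and Sobel's
    uniform-quadratic example: the greatest N with 2 * N * (N - 1) * bias < 1.
    Closed form: N(b) = ceil(-1/2 + (1/2) * sqrt(1 + 2 / b))."""
    return math.ceil(-0.5 + 0.5 * math.sqrt(1.0 + 2.0 / bias))

# Closely aligned goals (small bias) support richer communication;
# widely divergent goals collapse communication to a single message.
for b in (0.01, 0.05, 0.25):
    print(f"bias={b}: up to {max_credible_messages(b)} credible message(s)")
```

When the bias reaches 0.25, only one message survives, meaning the agent can no longer convey anything informative at all, which is the formal version of the essay’s point that shared goals underpin information transfer.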
The literature suggests three general avenues for collecting information for decision-making purposes.[9] Collection could be centralized, where the principal (a corporate leader or a single organization) gathers all information and makes the decision. Centralizing all efforts could be time consuming for the principal. Alternatively, the principal can delegate all the work, including the decision, to an agent. Delegating everything to another person or entity might be less expensive but poses a high risk in depending on someone else to make critical decisions. The third avenue assigns the task of information collection to an agent and the decision-making task to the principal. In this last approach, the leader waits a certain period for his expert to analyze the acquired information. Once the expert sends her analysis to the leader, the leader decides on a particular action based on the expert’s results. One can see that most organizations would employ the third avenue for information collection and decision-making.
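The taxonomy boils down to who collects and who decides. A minimal sketch of the three avenues; the role assignments follow the text above, while the trade-off notes are my illustrative reading, not a formal model:

```python
from dataclasses import dataclass

@dataclass
class Avenue:
    name: str
    collects_information: str  # who gathers and analyzes the data
    makes_decision: str        # who decides on the action
    tradeoff: str

AVENUES = [
    Avenue("centralized", "principal", "principal",
           "full control, but time consuming for the principal"),
    Avenue("delegated", "agent", "agent",
           "cheaper for the principal, but high risk on critical decisions"),
    Avenue("split", "agent", "principal",
           "agent analyzes and reports; principal retains the final call"),
]

# Most organizations follow the split avenue: the expert collects and
# analyzes, then the leader decides based on the expert's results.
split = next(a for a in AVENUES if a.name == "split")
print(f"{split.collects_information} collects, {split.makes_decision} decides")
```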
Studies have examined how the different avenues play out to achieve equilibrium, an optimal environment. One study considers the factor of time to determine the effect of making a decision earlier or later in the process. Grenadier and his team found differing effects from the agent’s perspective: an early decision is beneficial under delegation, and a later decision is optimal under centralization.[10] Argenziano and her team found that a decision (made by the principal) may be more precise when advice is provided by an expert (the agent), regardless of the amount of acquired information.[11] Argenziano’s conclusion suggests that collecting more information may not be necessary. Less information may be sufficient, so long as it is valid. It’s assumed that the expert would acquire only as much information as needed to provide credible and justifiable recommendations. It is clear from the literature that an agent carries weight and influence in providing advice to a principal.
The burden of data acquisition and analysis is placed on an agent. Just how much data should be collected? A better question to ask is how can all the collected information be reviewed and analyzed quickly?
Mandler examined a method of organizing data into less fine (more coarse) categories.[12] His method reduces complex information to manageable chunks (binary categories), which allows a person to make a quick decision. Instead of reviewing many solutions in great detail, one can generalize solutions into a smaller set of broad scenarios. One implication of such an approach is cost savings: the detailed review entails a higher cost, more time, and more information than the coarse one. Some degree of precision may be lost in generalizing categories, but the same objective and intent would still be achieved. Mandler’s method implies that less information can be collected.
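The spirit of coarse categorization can be illustrated with a toy screening routine: instead of scoring every option on a fine-grained scale, collapse each attribute into a binary category and compare the resulting coarse profiles. The thresholds and option data below are invented for illustration:

```python
# Toy illustration of coarse (binary) categorization: reduce fine-grained
# attributes to binary categories so options can be compared quickly.
# Thresholds and figures are hypothetical.

def coarse_profile(cost: float, impact: float) -> tuple[str, str]:
    """Collapse two fine-grained scores into binary categories."""
    return ("low cost" if cost <= 50_000 else "high cost",
            "high impact" if impact >= 0.7 else "low impact")

# Fine-grained data for three hypothetical policy options.
options = {
    "option A": (32_000, 0.82),
    "option B": (48_000, 0.35),
    "option C": (91_000, 0.90),
}

# A decision rule over coarse categories: keep low-cost, high-impact options.
shortlist = [name for name, (cost, impact) in options.items()
             if coarse_profile(cost, impact) == ("low cost", "high impact")]
print(shortlist)  # only option A clears both coarse screens
```

The decision rule never consults the exact dollar figures or impact scores once the coarse labels are assigned, which is where the savings in time and information come from.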
The idea of distilling complexity into binary categories may help analysts quickly weigh the pros and cons of competing priorities. Mandler’s method may be useful in a government agency where analysts are constrained by time in evaluating multiple policy options. Collecting and analyzing the data specific to each option or priority would be equally time consuming.
The struggle to give equal treatment to competing priorities is compounded by the interests of political leaders. Leadership may be biased toward one particular priority over another (building weapons versus providing health care, for example). And while a civil servant who conducts the analysis would evaluate each priority on its merits in a non-partisan manner, the civil servant still faces time pressure to complete an unbiased analysis. Patty examined this situation and found that when a political leader’s preferences are known, the civil servant will likely follow the leader’s prerogative and invest her time and energy in evaluating the leader’s preferred priority.[13] The other priority loses out. This is not to say that the less preferred priority would not be evaluated fairly. The civil servant would be driven to gather the information that the leader cares about most. That means focusing on the leader’s preferences. Patty’s study shows that in the U.S. government it is the process of collecting and analyzing information related to a particular policy option that is politically motivated. The civil servant is not biased; the process is.
The civil servant performs her job to develop a credible case and deliver it to the leader. The leader can take the advice or ignore it. The civil servant has no control over what the leader ultimately decides.
In sum, various organizations from the public sector to the private sector have options to better manage and process their internal information for improved decision-making. If funding is available and capacity exists to use technology, an organization can invest in an advanced IT system with data analytical tools and the IT infrastructure that supports those tools. Leaders should carefully weigh the risks and opportunities if they intend to use an AI chatbot as the emerging technology remains relatively immature.[14] Unintended consequences may result when decisions are based solely on AI outputs.
With or without information technology, an organization still needs procedures and a methodology for how it manages information in general and conducts data analysis in particular. It could control the entire process internally or contract out to partners. Leaders need to develop a strategy and a protocol to ensure all parties understand their roles and when results of the analysis are needed to make timely decisions. Timing may be critical in certain situations where a decision must be made at a specific point in time.
In all instances of making decisions, the leader wants to be certain that his decision is based on credible information. It is not necessary to collect all available data, parts of which may be irrelevant or dubious. Data quality remains more important than data quantity. A few factual and authoritative documents would build a stronger case than an exhaustive compilation of unsubstantiated opinions. An organization can defer to staff members’ expertise and experience to fill in any gaps and synthesize the limited collection of acquired information. A person remains involved in communicating recommendations to the leader, who in turn will decide on which recommendation to pursue.
While information technology can certainly play a role in information management, organizations should not discount the ability of the human operator. People still have roles to oversee the process and to make the actual decision. Human intelligence remains more capable than artificial intelligence in the areas of nuance, context, and synthesis.
Evidence for Practice
AI chatbots are wholly inappropriate for complex, sophisticated decision-making where the results of critical decisions can have consequential impacts.
Involving people in a coordinated effort to collect, analyze, and report information remains crucial to good management practice.
Experts assigned to evaluate options for decision-makers can save time by limiting their information collection effort to a small and strategic set of credible data. Less information may produce meaningful and fruitful advice.
Next Steps for Leaders
Leaders should review their procedures and methods pertinent to information management, information collection, and data analysis. If data collection forms are long or complicated, consider revising the forms by simplifying or generalizing the questions and responses. Can the forms be shortened to obtain only what you need for your particular project or goal? Do staff and partners understand their roles and responsibilities to ensure the information process from collection to analysis runs smoothly? Will the decision-maker receive the results of the analysis in time to make a decision? Is it clear who makes the decision? Is it a machine or a human who decides on what action to take?
Leaders should analyze their operating environment to determine the best and most suitable IT solution that will meet business needs and organizational objectives. All division managers should be consulted to get their input and secure their buy-in. Are there old and outdated IT systems that need to be upgraded? Would an enterprise-wide IT system work across all divisions, or should each division have its own system? Do current employees have the skills and abilities to use an advanced IT system with data analytical tools?
Open for Discussion
How does your organization manage and analyze information? Are you using ChatGPT or some other AI chatbot to help you make decisions? Did you encounter any problems in the outputs? Feel free to submit your answers in the comment section. If you’re not comfortable sharing your experiences, you can talk to a specialist at Peaceful Governance Institute (PGI) in private; click this link to call the PGI support hotline.
Notes
1. Jerry Green, (1982) “The Current Status of the Interface Between Information Science and Economics,” in Research Opportunities in Information Science and Technology, (Alexandria, VA: National Science Foundation): 17. Accessed 28 February 2026 at https://green.scholars.harvard.edu/sites/g/files/omnuum5981/files/green/files/the_current_status_of_the_interface_between_information_science_and_economics.pdf.
2. Jared Kaplan, Sam McCandlish, Tom Henighan, et al., (2020) “Scaling Laws for Neural Language Models,” arXiv:2001.08361v1, (Ithaca, NY: Cornell University): 4, 7. Accessed 28 February 2026 at https://doi.org/10.48550/arXiv.2001.08361.
3. Edward Y. Uechi, (2023) “Chapter 2: Technological Development,” in Business Automation and Its Effect on the Labor Force, (New York: Routledge/Productivity Press): 30–31.
4. Matthew Price, Stephen Walker, and Will Wiley, (2018) “The Machine Beneath: Implications of Artificial Intelligence in Strategic Decisionmaking,” PRISM 7(4): 93–95.
5. Advanced IT systems are enterprise grade information management systems designed specifically to meet the needs and requirements of particular organizations, typically built from a base design provided by IBM, Microsoft, or Oracle. The advancement includes tools and proprietary methods for users to carry out data analysis functions. An organization’s internal data are imported for deep analysis. The IT system operates in a closed network that’s restricted to employees.
6. Erik Brynjolfsson, and Kristina McElheran, (2016) “The Rapid Adoption of Data-Driven Decision-Making,” American Economic Review: Papers & Proceedings 106(5): 138.
7. Brynjolfsson and McElheran: 136–137.
8. Vincent P. Crawford, and Joel Sobel, (1982) “Strategic Information Transmission,” Econometrica 50(6): 1450.
9. I summarize the three avenues and add my interpretation in the paragraph based on cogent and succinct descriptions provided in: Rossella Argenziano, Sergei Severinov, and Francesco Squintani, (2016) “Strategic Information Acquisition and Transmission,” American Economic Journal: Microeconomics 8(3): 136–137.
10. Steven R. Grenadier, Andrey Malenko, and Nadya Malenko, (2016) “Timing Decisions in Organizations: Communication and Authority in a Dynamic Environment,” American Economic Review 106(9): 2570–2571.
11. Argenziano et al.: 140.
12. Michael Mandler, (2020) “Coarse, Efficient Decision-making,” Journal of the European Economic Association 18(6): 3006–3009.
13. John W. Patty, (2009) “The Politics of Biased Information,” The Journal of Politics 71(2): 393.
14. Recent news reports highlight that competing AI chatbots from Meta and xAI (Grok) are still in the development stage (works in progress).
Reuters, (2026) “Meta Pushes AI Model ‘Avocado’ Rollout to May or Later, NYT Reports,” Reuters, 12 March. Accessed 15 March 2026 at https://www.reuters.com/technology/meta-delays-rollout-new-ai-model-nyt-reports-2026-03-12/.
Fred Lambert, (2026) “Musk Admits xAI ‘Not Built Right’ — Weeks After Tesla Invested $2 Billion,” Electrek, 13 March. Accessed 15 March 2026 at https://electrek.co/2026/03/13/elon-musk-admits-xai-built-wrong-rebuild-tesla-spacex-investment/.
