A senior U.S. Air Force officer raised alarms when she said that “Twitter feeds of randos” are tracking U.S. aircraft movements, creating operational security concerns.
Maj. Claire Randolph, chief of weapons and tactics at U.S. Air Forces Central Command, said civilians can collect and consolidate aircraft movement data online, commonly called OSINT, or open-source intelligence.
“If U.S. analysts did some of that,” Randolph said, “we would consider it secret or maybe even top secret. But that stuff is just allowed on the open internet. So operational security … is really, really difficult to conceal.”
Randolph spoke on Feb. 9 during a session billed as “Midnight Hammer: Lessons Learned” at the Mitchell Institute for Aerospace Studies.
In an essay published this week, OSINT analyst Zaki Khalid said Randolph’s remarks show how open data and online communities can complicate military operational security in what he called a “highly informationised” environment.
“It is very difficult for any nation-state to exert total information dominance or manage information flows completely,” Khalid wrote.
Public data, real-world risk?
Khalid, who leads Pantellica, Pakistan’s first registered OSINT training and consultancy firm, told Straight Arrow News that one of the underlying problems is that what used to be niche, proprietary information is now widely available.
“Today, any person or object could be a ‘sensor’ serving as a data point for exploitation,” Khalid said, arguing states can no longer count on information dominance as they once did.
Some critics said the bigger issue isn’t the analysts but the fact that widely receivable tracking signals exist at all, noting that aviation, maritime and space tracking also serve routine, non-military uses the public often overlooks.
“Yes, there are analytical enterprises focused on keeping track of military developments,” Khalid said, “but these assets are equally relevant to sectors as diverse as agriculture, geology, hydro surveys … Everything need not be militarized.”
The dispute hinges on two basic questions: What counts as OSINT? And when can analysis of public data create real-world risk?
Those concerns extend well beyond hobbyist accounts and have been on the Pentagon’s radar for years. In 2023, the Air Force told National Defense Magazine that OSINT is a “direct threat” to military air operations worldwide.
Robert Spalding, a retired Air Force brigadier general, warned that open-source data and social media create operational risks.
“That’s why I left the military,” Spalding told the outlet Israel Hayom. “Because we are not protecting data. I think it’s the most dangerous thing we can do as a nation.”
Why the OSINT fight over data matters
Militaries, journalists and the public are wrestling with a growing question: When information is legally public, how much harm can come from organizing, analyzing and reposting it?
OSINT specialist Nico Dekens told SAN that the practice relies strictly on publicly available information, including government records, social media posts, commercial datasets and news archives. Dekens, a former Dutch law enforcement intelligence analyst who now serves as chief innovator at the software firm ShadowDragon, explained that the discipline focuses on turning those raw data points into actionable timelines and risk assessments.
In a ShadowDragon explainer, Dekens described how organizations use OSINT across cybersecurity, law enforcement, business intelligence and journalism.
SAN reported earlier this month on how publicly available flight-tracking helped document major U.S. airlift movements, including a surge of C-17 transport aircraft into Europe and the Middle East.
That story stressed that the movement data came from civilian feeds, not official deployment announcements, and that the Pentagon had not publicly tied those flights to any named operation.
In a separate op-ed, OSINT expert and special operations veteran David Cook argued that what counterterrorism professionals call “leakage” — public traces of grievance and intent — can give law enforcement advance warning of lone-actor plots.
He said those open signals, drawn from publicly available posts, viewing histories and purchases, can help security teams allocate resources to “soft targets” before a planned attack, not just reconstruct what happened afterward.
Where OSINT stops and hacking begins
A key point in the debate is where OSINT stops.
Dekens emphasized that legitimate open-source analysis ends when a user must break the law or trick a system to compile the data. He noted that using malware, exploiting software bugs or relying on stolen passwords and breached databases falls outside the definition of OSINT.
“OSINT is not hacking, exploiting vulnerabilities, malware or breaking into systems,” Dekens said.
He described a practical rule of thumb: If someone “must log in without proper permission,” bypass paywalls or access controls, or “trick someone” to obtain non-public information, then “you’ve left OSINT.”
In other words, he said, the dividing line isn’t whether the material feels sensitive — it’s whether any normal person can access it as intended, without tricks, special access or bypassing controls.
When plane-spotting becomes a problem
Public aircraft tracking can be legal and still pose risks, Dekens said, depending on context.
A tracking technology known as Automatic Dependent Surveillance-Broadcast Out, or ADS-B Out, broadcasts an aircraft’s position, altitude and speed about once per second. The data is intended for wide reception, primarily by air traffic controllers, but that same openness means anyone with an inexpensive receiver can collect it, which can create operational security risk when analysts track it, Dekens said.
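As an illustration of how receivable that broadcast is, the sketch below pulls two fields out of a raw ADS-B frame with nothing but standard-library Python. The sample message is a widely circulated test frame, not live data, and the parsing covers only the frame header, not position decoding.

```python
# Minimal sketch: read header fields from a raw ADS-B (Mode S extended
# squitter) frame. The 112-bit message arrives as 28 hex characters; the
# first 5 bits are the downlink format (DF 17 = ADS-B), and the next three
# bytes carry the aircraft's 24-bit ICAO address -- the identifier hobbyist
# trackers key their databases on.

def parse_adsb_header(hex_msg: str) -> dict:
    first_byte = int(hex_msg[0:2], 16)
    return {
        "downlink_format": first_byte >> 3,    # top 5 bits of byte 0
        "icao_address": hex_msg[2:8].upper(),  # 24-bit transponder ID
    }

# Widely published sample frame (an aircraft identification message).
sample = "8D4840D6202CC371C32CE0576098"
print(parse_adsb_header(sample))
# {'downlink_format': 17, 'icao_address': '4840D6'}
```

Because the frame format is public and unencrypted, extracting the identifier takes a few lines; the barrier to tracking is hardware cost, not cryptography.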
For many routine flights, the risk is low because the signal is part of normal air traffic surveillance.
Risks rise when analysts track sensitive movements — VIP travel patterns, visible military or government aircraft, covert logistics or tail numbers that identify a specific person, unit or company.
Dekens said the larger concern often isn’t the raw broadcast itself but the inferences that can be drawn when aircraft data is combined with other public information — what’s called the “mosaic effect.” He described that concept as “harmless tiles + aggregation = sensitive picture,” where separate public details can become more sensitive when assembled into a larger narrative.
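Dekens’ “harmless tiles + aggregation = sensitive picture” formula can be read as a simple join. In the sketch below, every record is fabricated for illustration; each dataset alone looks innocuous, but combining a public registry with public spotter reports yields a pattern-of-life inference.

```python
# Illustrative only: all records below are invented. Individually each
# "tile" is harmless; joined on the tail number, they suggest a recurring
# movement pattern -- the kind of inference the "mosaic effect" describes.
from collections import Counter

registry = {  # public registration data: tail number -> operator
    "N123AB": "Example Logistics LLC",
}

sightings = [  # public spotter reports: (tail, airport, date)
    ("N123AB", "Airfield X", "2026-02-01"),
    ("N123AB", "Airfield X", "2026-02-08"),
    ("N123AB", "Airfield X", "2026-02-15"),
]

def infer_pattern(registry, sightings):
    visits = Counter((tail, airport) for tail, airport, _ in sightings)
    return [
        f"{registry.get(tail, 'unknown operator')} aircraft {tail} "
        f"seen at {airport} {n} times"
        for (tail, airport), n in visits.items()
        if n >= 3  # repetition turns isolated sightings into a pattern
    ]

print(infer_pattern(registry, sightings))
# ['Example Logistics LLC aircraft N123AB seen at Airfield X 3 times']
```

The sensitive output exists in neither input dataset; it emerges only from the aggregation, which is why controlling any single public source rarely closes the gap.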
Khalid said military operational security planning increasingly has to adapt to “undesirable battlefield transparency,” and focus on improving military deception rather than relying on information control that may no longer be realistic.
What ‘responsible’ sharing looks like around military activity
The debate has also prompted questions about norms for sharing information about military activity — not just what is legal to collect, but what is wise to publish.
“The fact that one can collect and analyze publicly available information does not imply that one should publicly share the analysis or intelligence,” Dekens said, particularly when it could create real-world harm.
Dekens urged OSINT users to avoid posting actionable details that could put personnel at risk, including routes, ETAs, call signs, tail numbers and “pattern of life” information, and to focus on higher-level context rather than real-time “tactical tracking.”
He suggested a basic test for dissemination: “Could this enable targeting or interdiction?” If yes, he said, OSINT practitioners should generalize, delay or refrain from sharing.
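Dekens’ dissemination test can be thought of as a publish-time gate. The sketch below is a hypothetical rule-of-thumb filter, not any tool he described; the pattern choices are illustrative, flagging drafts that contain the kinds of actionable details he listed so the author can generalize, delay or refrain.

```python
import re

# Hypothetical pre-publication check inspired by the "could this enable
# targeting or interdiction?" test. Patterns are illustrative, not complete.
ACTIONABLE_PATTERNS = {
    "tail number": re.compile(r"\bN\d{2,5}[A-Z]{0,2}\b"),  # US-style registration
    "ETA": re.compile(r"\bETA\b", re.IGNORECASE),
    "call sign": re.compile(r"\bcall\s*sign\b", re.IGNORECASE),
}

def dissemination_check(draft: str) -> list[str]:
    """Return the categories of actionable detail found in a draft post."""
    return [name for name, pat in ACTIONABLE_PATTERNS.items()
            if pat.search(draft)]

flags = dissemination_check("N123AB inbound, ETA 0300Z")
print(flags)  # ['tail number', 'ETA'] -> generalize, delay or refrain
```

A real checklist would be human judgment, not regexes; the point of the sketch is only that the test can be applied before posting, not after.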
OSINT has its advantages, but Khalid warned that its findings can also create problems when shared recklessly. Practicing OSINT “for pure adrenaline rush or commercial motives” can lead people to overlook downstream consequences, he said.
How commercial OSINT tools build in guardrails
That question of responsibility doesn’t stop with individual accounts; it also applies to private-sector OSINT platforms that can automate collection, correlation and reporting at scale.
Regarding his own firm, ShadowDragon, Dekens said its tools do not circumvent encryption or privacy settings and collect only lawfully accessible public data. He added that training for ShadowDragon employees emphasizes boundaries around publicly available information, validation and “responsible,” ethical and lawful application of findings.
“Private tools shouldn’t turn public crumbs into push-button targeting, stalking or mass surveillance,” he said.
Companies can build guardrails, he said, through acceptable-use rules, customer screening, role-based feature limits with audit trails and “safety by design” defaults that make dragnet-style use harder.
What the military can change — and what it can’t control
If information is truly public and lawfully obtained, Dekens said, the military’s direct recourse against outside individuals is limited because analysis of public information is generally protected, except when it crosses into unlawful conduct.
He said recourse is often more practical through reducing exposure than trying to punish commentary.
Dekens said the military could limit the dissemination of certain flight data and could use privacy programs that assign aircraft temporary International Civil Aviation Organization addresses not attributable to a specific airframe. He also cited platform levers, such as takedown requests for posts that violate rules on doxxing, threats or harassment.
What’s next as public data grows
In his essay, Khalid argued that the broader answer is adaptation. He wrote that modern military planning should assume adversaries and analysts will use open data creatively, and he urged militaries to respond with better deception practices and more flexible decision-making rather than relying on restrictions that may be difficult to enforce.
The debate is likely to persist as public data sources proliferate and analysis becomes easier to automate. The central question is shifting from whether information is public to how quickly it can be assembled into actionable insight — and who bears responsibility for limiting harm when the underlying “tiles” are legally accessible.