
In today’s fast-paced transport networks, real-time data serves as the lifeblood that ensures smooth and efficient operations. The increasing demand for accurate and timely information has elevated the importance of reliable data handling to unprecedented levels. Even an occasional error can lead to significant operational disruptions, financial losses, and damage to reputations.
Transport authorities and operators often modify or extend the base data standards for specific local requirements, which leads to various interpretations and implementations across different networks. In doing so, they face a dilemma: adhering strictly to standardised data formats can limit their ability to address unique operational needs, but deviating from the standards can lead to inconsistencies, interoperability issues, and vendor lock-in.
It therefore seems pertinent to address this challenge as early as possible, by finding a balance that leverages the strengths of standardised data while accommodating local customisations.
The solution to this dilemma is to use a standard in its purest form, to augment it with complementary tools (rather than amending the standard), and to utilise modular open ecosystems that tie into those standards. Not only does this allow systems to be customised to local requirements, but the standard remains unchanged and there is no supplier lock-in. Furthermore, there is less chance of inconsistencies and interoperability issues, key factors in the performance of a transport network.
This white paper explores the challenges posed by flexibility in data standards like GTFS and GTFS-Realtime (GTFS-R) and offers a solution that leverages these standards effectively without sacrificing local requirements.
Flexibility in Standards Leads to Inconsistencies and Supplier Lock-In

The General Transit Feed Specification (GTFS) and its real-time counterpart, GTFS-Realtime (GTFS-R), were developed to standardise how transit data is shared and consumed. While these standards have been instrumental in enabling interoperability, they are not 'certified' standards; flexibility in their use is common, as implementers accommodate a wide range of operational contexts. This flexibility, however, leads to varying interpretations and implementations across different networks.
Many transport authorities and operators modify or extend the base standards to suit their specific needs. While this approach addresses local requirements, it results in data formats that deviate from the “pure” specification. Consequently, third-party developers and integrators face challenges in consuming this data, leading to increased development costs and potential errors. Moreover, this customisation can inadvertently lock authorities into specific suppliers who implement these modified standards. Over time, these suppliers become legacy agents, making it difficult and costly for authorities to switch providers or adopt new technologies. This vendor lock-in stifles innovation and can leave systems reliant on outdated or unsupported software.
How can we utilise the power of standards without losing the benefits of local requirements?
Our solution involves a three-pronged approach:
• Decide on and use a standard in its purest form (and enforce it)
• Augment the standard with complementary tools, not amendments to the standard, and
• Design modular open ecosystems to tie into those standards, avoiding proprietary supplier lock-in.
1. Decide On and Use a Standard in Its Purest Form (and Enforce It)
Committing to the pure implementation of established standards like GTFS and GTFS-R ensures consistency and interoperability across systems. By adhering strictly to these standards, transit authorities and manufacturers can facilitate seamless data exchange with third-party applications, partners, and other stakeholders. There are several other key benefits:
- Interoperability is enhanced because uniform data formats enable different systems and applications to work together without custom integration efforts.
- Development costs are reduced since developers can build applications and tools that function across multiple systems without accommodating bespoke modifications.
- Standards are often supported by a community of developers and organisations, providing shared resources and best practices.
To ensure compliance, organisations can implement policies that mandate the use of standard-compliant data formats and embed those compliance requirements in contracts with suppliers.
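One practical way to enforce this is to gate every published feed behind an automated check. The sketch below (Python, standard library only) illustrates the idea by verifying that a GTFS zip contains the files and key columns the specification requires; a production pipeline would typically rely on a full validator such as MobilityData's open-source gtfs-validator rather than a hand-rolled script.

```python
# Minimal GTFS structural check: verifies that a feed zip contains the
# files and key columns required by the GTFS specification. This is a
# sketch of the principle, not a replacement for a full validator.
import csv
import io
import sys
import zipfile

REQUIRED_FILES = {
    "agency.txt": {"agency_name", "agency_url", "agency_timezone"},
    "stops.txt": {"stop_id"},
    "routes.txt": {"route_id", "route_type"},
    "trips.txt": {"route_id", "service_id", "trip_id"},
    "stop_times.txt": {"trip_id", "stop_id", "stop_sequence"},
}


def check_feed(path: str) -> list[str]:
    """Return a list of structural problems found in the GTFS zip at `path`."""
    problems = []
    with zipfile.ZipFile(path) as feed:
        names = set(feed.namelist())
        for filename, required_columns in REQUIRED_FILES.items():
            if filename not in names:
                problems.append(f"missing required file: {filename}")
                continue
            with feed.open(filename) as f:
                header = next(csv.reader(io.TextIOWrapper(f, "utf-8-sig")), [])
                missing = required_columns - {column.strip() for column in header}
                if missing:
                    problems.append(f"{filename}: missing columns {sorted(missing)}")
    return problems


if __name__ == "__main__":
    issues = check_feed(sys.argv[1])
    print("\n".join(issues) or "No structural issues found.")
```

Running such a check in continuous integration, or as an acceptance test on supplier deliverables, turns a contractual compliance requirement into something measurable.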
While strict adherence may seem limiting, many standards are designed with extensibility in mind. Although deviating from a standard may seem like a quick fix to a problem, we argue that it never makes sense in the long run.

2. Augment the Standard with Complementary Tools, Not Amendments to the Standard

When unique local requirements arise, instead of modifying the standard itself, organisations can develop complementary tools and extensions that operate alongside the standard. This approach preserves the integrity of the core data format while providing the necessary functionality.
Extending capabilities without altering the core standard can be achieved through various implementation strategies:
- Metadata and annotations allow additional information to be attached using standardised methods, which can be ignored by systems that do not recognise them without affecting core functionality.
- Developing Application Programming Interfaces (APIs) or middleware can translate or augment standard data for specific applications, keeping the core data unchanged (a sketch of this approach follows below).
- Designing modular components that can be added or removed without altering the base system provides flexibility for local needs.
This approach maintains compatibility because core systems and third-party applications continue to function correctly with the standard data. Flexibility is enhanced, allowing organisations to address local needs without disrupting the broader ecosystem. Scalability is improved since extensions can be updated or replaced independently of the core standard.
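As an illustration of the middleware strategy above, the following Python sketch reads a standard GTFS-Realtime trip-updates feed (via the official gtfs-realtime-bindings package) and serves a combined view that attaches locally maintained annotations keyed by trip_id. The feed URL, the annotation file, and its fields are hypothetical; the point is that the published GTFS-R feed itself is never modified.

```python
# Middleware sketch: augments a standard GTFS-Realtime feed with local
# annotations *outside* the feed itself, so the protobuf published to
# third parties stays pure. The annotation source (a local JSON file
# keyed by trip_id) and its fields are hypothetical placeholders.
import json
import urllib.request

from google.transit import gtfs_realtime_pb2  # pip install gtfs-realtime-bindings

FEED_URL = "https://example.org/gtfs-rt/trip-updates"  # placeholder URL
ANNOTATIONS_PATH = "local_annotations.json"  # e.g. {"trip_123": {"note": "rail replacement bus"}}


def augmented_view() -> dict:
    """Combine the untouched GTFS-R feed with local, non-standard context."""
    feed = gtfs_realtime_pb2.FeedMessage()
    with urllib.request.urlopen(FEED_URL) as response:
        feed.ParseFromString(response.read())

    with open(ANNOTATIONS_PATH) as f:
        annotations = json.load(f)

    view = {}
    for entity in feed.entity:
        if not entity.HasField("trip_update"):
            continue
        trip_id = entity.trip_update.trip.trip_id
        view[trip_id] = {
            "timestamp": feed.header.timestamp,     # straight from the standard feed
            "local": annotations.get(trip_id, {}),  # side-car data, invisible to GTFS-R consumers
        }
    return view
```

Consumers that only understand pure GTFS-R keep reading the untouched feed; local applications that want the extra context call the augmented view instead.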

3. Design Modular Open Ecosystems to Tie into Those Standards, Avoiding Proprietary Supplier Lock-In

Creating an open ecosystem with modular components allows organisations to integrate various tools and services without being tied to a single vendor. This approach fosters innovation and flexibility, enabling organisations to adapt to changing needs and technologies.
Key principles include:
- Designing systems with interchangeable components allows for easy upgrades and replacements.
- Using open standards and protocols ensures that components can communicate effectively.
- Selecting vendor-agnostic solutions reduces dependency on specific suppliers.
Maintaining ownership and control over data through open standards compliance and transparent procurement processes prevents vendors from restricting access or portability.
This strategy encourages innovation, as an open ecosystem fosters the development of new solutions and services. Cost savings can be realised because competition among suppliers leads to better pricing and terms. Resilience is enhanced since modular systems can adapt more easily to changes, reducing the risk of obsolescence.
Consider a transport operator that implements an open-source platform for real-time data management, or a series of individual components that ingest and use GTFS data. By adhering to GTFS and GTFS-R in their pure forms and building modular extensions for local needs, the operator can integrate tools such as passenger information systems, analytics dashboards, and mobile applications, all from 'best-in-class' suppliers. This setup allows them to switch vendors for individual components without disrupting the entire system, ensuring long-term sustainability and flexibility.
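To make the 'interchangeable components' idea concrete, the sketch below defines a minimal, vendor-agnostic interface in Python: every downstream component consumes the same standard GTFS-Realtime message and nothing else. The component classes are illustrative stand-ins, not references to real products.

```python
# Sketch of a vendor-agnostic integration point: each downstream
# component (passenger information, analytics, mobile push, ...) is
# coded against the same small interface and consumes only standard
# GTFS-Realtime messages. Component names here are illustrative.
from typing import Protocol

from google.transit import gtfs_realtime_pb2  # pip install gtfs-realtime-bindings


class RealtimeConsumer(Protocol):
    """Anything that can accept a standard GTFS-Realtime feed message."""

    def consume(self, feed: gtfs_realtime_pb2.FeedMessage) -> None: ...


class DeparturesBoard:
    def consume(self, feed: gtfs_realtime_pb2.FeedMessage) -> None:
        # Render upcoming departures from trip updates.
        for entity in feed.entity:
            if entity.HasField("trip_update"):
                print("display:", entity.trip_update.trip.trip_id)


class AnalyticsSink:
    def consume(self, feed: gtfs_realtime_pb2.FeedMessage) -> None:
        # Count vehicle positions for a dashboard; storage is out of scope here.
        positions = sum(1 for e in feed.entity if e.HasField("vehicle"))
        print("analytics: vehicle positions =", positions)


def broadcast(feed: gtfs_realtime_pb2.FeedMessage, consumers: list[RealtimeConsumer]) -> None:
    """Fan the untouched standard feed out to every registered component."""
    for consumer in consumers:
        consumer.consume(feed)
```

Because the contract between components is the standard itself, one supplier's departures board or analytics module can be swapped for another's without touching the rest of the ecosystem.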
Conclusion
Balancing the power of standardised data formats with the necessity of addressing local requirements is crucial for the success of real-time data systems. By deciding upon and enforcing the use of standards in their purest forms, augmenting these standards with complementary tools rather than altering them, and designing modular open ecosystems, organisations can reap the benefits of both uniformity and customisation.
This approach mitigates the risks associated with inconsistent data formats and vendor lock-in while fostering an environment where innovation and adaptability are possible. Ultimately, it enables organisations to deliver reliable, efficient, and high-quality services that meet both operational needs and customer expectations.
We would love to hear your thoughts on this. Drop us a line!