The Hyperlayering Architecture: A Journey from Failure to Functional Portability (Full Presentation Script - English)

I. Introduction: The Problem of the Silo (P1-P7)

P1: Title
Today, I want to introduce you to the Hyperlayering Architecture for Web mapping, explaining what it is and what significance it holds for the development of the World Wide Web. This architecture is less a simple technology than a philosophy for the information society, and we believe it directly advances the W3C's vision of a "truly decentralized, user-first Web."

P2: AC Meeting Context
The content I will present today is a deeper dive into the topics discussed at the 2024 AC Meeting.

P3: The Critical Challenge & Silo Problem
Although a vast amount of information useful for disaster preparedness is publicly available on the Web in the form of maps, we are unable to freely and easily synthesize (overlay) those maps onto a single map. This is widely recognized among disaster information experts as a major systemic challenge for the World Wide Web. In the information systems industry, this is the well-known Silo Problem, and what is fundamentally lacking is interoperability.

P4: The Mashup Solution, 20 Years Ago
As many of you know, the mechanism for overlaying and visualizing various kinds of geographical information on one map, known as the Mashup, was already in practical use around 2004 and was a representative feature of the Web 2.0 era. This was achieved by an aggregation service server, or cloud computing, as shown in this diagram. So the question is: why wasn't this problem solved 20 years ago? We will explore that now.

P5: The Flaws of Centralization
Let's look at the problems of the Mashup. The service at the center becomes the choke point for interoperability, and it must collect all the necessary data. During a disaster in particular, it requires ad-hoc processing and the handling of massive access loads. This comes with immense cost and creates a single point of failure. Furthermore, the fierce competition among service providers who gained dominant positions and vested interests has, ironically, formed new silos. In addition, aggregation services require information replication and redistribution, involving complex adjustments of legal rights, which leads directly to problems of rights infringement. The centralized Web 2.0 holds a structural flaw that endangers the very survival of the World Wide Web, and the time for transformation is urgent.

P6: W3C Shared Concern
This awareness is shared not only by us but also by the W3C. Looking at the Operational Principles of the W3C Vision, the principles of Avoiding Centralization and User-First clearly align with the structural defects we have pointed out. The very fact that these must be explicitly stated in the Vision suggests the World Wide Web has reached a critical state.

P7: Revisiting the Original Web
Let's look back at the original Web, over 30 years ago. The World Wide Web implemented by Tim Berners-Lee was founded on an extremely simple core philosophy: "hyperlinks via URL" and "integration by the user's browser." This basic philosophy achieved interoperability from the outset. No aggregation service exists here; it is autonomously decentralized. Surprisingly, the numerous problems that plague mashup services simply do not exist in this architecture.

II. The Initial Failure (P8-P14)

P8: The Birth of Hyperlayering
The Hyperlayering Architecture inherits the architecture of this original World Wide Web and evolves it for the map medium. It was conceived in 1996, sparked by discussions regarding the Great Hanshin Earthquake.

P9: Hyperlayering Basic Behavior (Animation Focused)
Let's look at its basic behavior. The distinctive feature is the absence of an aggregation service. A Link Collector publishes the URLs of various layers. First, when the user selects a layer, the browser follows the hyperlink to access and display the content. Unlike ordinary hypertext navigation, the Link Collector page remains in place. Next, the user selects another layer, and that layer is overlaid onto the map. This "layering utilizing the hyperlink" is Hyperlayering. Behind the scenes, the user is autonomously integrating information from distributed sites on their own terminal. This is the ideal Web drawn in the W3C Vision, based on the principles of "Avoiding Centralization" and "User First."
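To make this behavior concrete, here is a minimal sketch in TypeScript. Everything in it, including the LayerLink shape, the example URLs, and the renderOverlay routine, is a hypothetical placeholder for illustration rather than part of any existing specification or of svgmap.js; the point is only that a Link Collector is nothing more than hyperlinks, and that the overlaying happens entirely in the user's browser, with no aggregation server involved.

```typescript
// A Link Collector is nothing more than a list of hyperlinks to layers hosted
// on distributed, independent sites. (All names, URLs, and the LayerLink shape
// are hypothetical placeholders for illustration.)
interface LayerLink {
  title: string;
  href: string; // URL of an independently published map layer
}

const linkCollector: LayerLink[] = [
  { title: "Flood hazard (example)", href: "https://example.org/flood-layer.svg" },
  { title: "Shelters (example)", href: "https://example.net/shelters-layer.svg" },
];

// Layers the user has switched on, keyed by URL. The Link Collector itself is
// never navigated away from, so further layers can be added at any time.
const activeLayers = new Map<string, string>();

// When the user selects a layer, the browser follows the hyperlink and the
// result is overlaid on the map entirely on the user's terminal.
async function toggleLayer(link: LayerLink): Promise<void> {
  if (activeLayers.has(link.href)) {
    activeLayers.delete(link.href); // deselect: remove the overlay
  } else {
    const response = await fetch(link.href); // follow the hyperlink
    activeLayers.set(link.href, await response.text());
  }
  renderOverlay([...activeLayers.values()]); // client-side integration
}

// Placeholder for compositing all active layers onto a single map view.
function renderOverlay(layers: string[]): void {
  console.log(`Rendering ${layers.length} overlaid layer(s) client-side`);
}
```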
However, cloud-based map services subsequently became widespread, and the Hyperlayering Architecture's vision became increasingly unrealistic. Next, I will introduce the process of that setback.

P10: Standardization Failure
We spent the past quarter century attempting to construct specifications based on SVG, which seemed well suited to mapping and was expected to become browser-native, and we pursued their implementation, standardization, and practical use. However, standardization stalled. We achieved standardization within Japan as JIS X.7197, but domestic standards hold little influence. Meanwhile, we participated in the W3C SVG Working Group but have not yet achieved standardization there. This has multiple causes. First, Web browsers focused on enriching features for individual service experiences, while the fundamental Web principles of information integration and interoperability were neglected and seen as the cloud's responsibility. The long time required for SVG standardization and browser implementation was also a contributing factor.

P11: The Conflict of Practicality
Next, let's look at the struggle for practical application of the Hyperlayering Architecture. After we established its fundamentals around 2000, cloud computing and server-based mashups emerged and spread widely. In this situation, was the ideal envisioned by the Hyperlayering Architecture achievable? The answer was: impossible.

P12: The Impossible Layering
We could create a Link Collector pointing to map service sites, and by selecting a link, one could jump to an individual map site. But that was it. The crucial layering was impossible.

P13: The Simple Reason
The reason is simple: the original Hyperlayering Architecture presumed the establishment of a common map format on the Web. We had hoped for SVG. But the reality was different.

P14: The Rise of the Dummy Terminal
The reality is that data formats are fragmented (GeoJSON, proprietary CSV, PNG tiles, and so on). Furthermore, with the evolution of HTML5 and modern browsers, advanced processing such as information integration shifted to the server side. The Web browser became an "isolated, proprietary dummy terminal" running its own specialized Web App. As a result, users lost the right to freely move their data and experience, and the browser lost its role as the engine of Web interoperability.

III. The Breakthrough: Layers as Web Apps (P15-P18)

P15: The Catalyst for Change
In this situation, we experienced the Great East Japan Earthquake in 2011. The problems with cloud computing became evident, and yet the value of the Hyperlayering Architecture could not be discarded. The search for a way to overcome the failure began. From that necessity, Layers as Web Apps was born.

P16: LaWA Architecture (Animation Focused)
Layers as Web Apps treats the Web application itself, composed of JavaScript, HTML, and CSS running in the browser, as a layer, rather than just content or data. A new server delivering Layers as Web Apps is added to the diagram. On the client, we run a framework called svgmap.js, which provides the mechanism to plug in and execute these Web Apps as layers. The Link Collector links to the Layers as Web Apps using URLs, as before. When the user selects a layer, the svgmap.js framework follows the link, then loads and executes the code on the client. The Layer as Web App accesses data in its own service-specific format, converts it into a format svgmap.js can display, and displays it as a layer. It can also implement its own UI (for example, a date picker for weather forecasts). This mechanism is an approach toward redefining interoperability on the basis of Web fundamentals: strategically utilizing the advanced Web application runtime environment for client-side information integration, free from any specific service. This resembles microservices, but the synthesis of the output onto a single map is its unique feature.
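To illustrate the shape of such a layer, here is a minimal sketch in TypeScript. The LayerHost interface, the registerLayer hook, the DisplayableFeature type, and the URL are assumptions made for this sketch, not the actual svgmap.js plug-in API; the sketch only shows the pattern of fetching a service's own data format and converting it client-side into something the framework can draw.

```typescript
// Illustrative sketch only: LayerHost, registerLayer(), and DisplayableFeature
// are hypothetical stand-ins for a plug-in interface, not the actual svgmap.js API.
interface DisplayableFeature {
  lat: number;
  lng: number;
  label: string;
}

interface LayerHost {
  // The framework calls refresh() whenever the layer needs to be (re)drawn and
  // renders whatever displayable features the layer returns.
  registerLayer(name: string, refresh: () => Promise<DisplayableFeature[]>): void;
}

// A Layer as Web App: a small piece of code, loaded via the layer's URL, that
// understands one service's own data format and converts it on the client into
// something the map framework can draw.
export function shelterLayer(host: LayerHost): void {
  host.registerLayer("Evacuation shelters (example)", async () => {
    // Fetch this service's data in its native format (GeoJSON in this example).
    const res = await fetch("https://example.org/shelters.geojson"); // hypothetical URL
    const geojson = (await res.json()) as {
      features: { geometry: { coordinates: [number, number] }; properties?: { name?: string } }[];
    };

    // Convert the service-specific format into the framework's displayable form.
    return geojson.features.map((f) => ({
      lng: f.geometry.coordinates[0],
      lat: f.geometry.coordinates[1],
      label: f.properties?.name ?? "shelter",
    }));
  });
}
```

The design point this sketch is meant to echo is that the conversion logic travels with the layer's URL and runs in the user's browser, so no central aggregation server ever has to understand every service's format.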
However, a different Layer as Web App must be prepared for each map service. Who will prepare this vast number of Layers as Web Apps? That is the challenge.

P17: Implementation and Open Source
Just as the Web browser is fundamentally open source, our environment, which we call the "browser for maps," is also being implemented as open source. We addressed the challenge of service-specific Layers as Web Apps by developing a massive number of them through sheer effort. We have achieved the integration of over 1,000 layers, arguably the world's largest integration of disparate map information services.

P18: Realization and Call to Action
After a long journey, the practical application of the Hyperlayering Architecture is becoming a reality. It has been in internal use at KDDI for over 15 years, during which time a large number of Layers as Web Apps were developed and the framework matured. External practical use is now beginning: KDDI's Disaster Map Board, a service built on the Hyperlayering Architecture, has been commercially available since October 2025. At this TPAC breakout session, we would be delighted to gain strategic collaborators within the W3C. We are looking not just for map technology experts, but for those who are fundamentally interested in Avoiding Web Centralization and in promoting User- and Browser-Centric Interoperability, the very principles established in the W3C Vision. Thank you. Now, to truly demonstrate the power of this new philosophy, let's proceed with a live demonstration of the Hyperlayering Architecture in action.