Learning
This project strengthened my ability to work across the full data visualisation pipeline, from raw data to final interactive product. I gained deeper experience in exploratory data analysis and developed a more structured approach to handling messy, real-world datasets, especially when working with multiple sources and performing data merging in Python. I also improved my ability to translate analysis into visual concepts and then into interactive implementations; working with D3 and Svelte helped me better understand how to bridge the gap between prototyping and production-ready code.

Another key learning was how to contribute effectively within a collaborative environment, especially when joining a project mid-way. I learned how to quickly align with existing work, communicate clearly with teammates, and identify where I could add the most value.

I also developed a stronger understanding of the importance of consistency in data visualisation, not only in chart design but also in layout, structure, and storytelling. Defining shared elements such as titles, subtitles, and sources across all visualisations helped create a more coherent and professional final product. Overall, the project reinforced the importance of combining analytical rigour, design clarity, and technical implementation to create meaningful and engaging data-driven stories.
Impact
The project transforms complex and fragmented housing data into an accessible and interactive experience, making patterns in industrialised housing more visible and understandable. By combining data analysis with intuitive visualisations, it supports informed discussions around construction, urban development, and housing strategies. It also highlights how data-driven storytelling can help bridge the gap between raw data and decision-making in the built environment.
Challenge
One of the main challenges was joining the project at a later stage and quickly understanding the existing data, decisions, and technical setup. It required adapting to an established workflow while still contributing meaningfully to the direction of the project.
The dataset itself posed significant challenges. It contained missing values, inconsistencies, and outliers that needed careful investigation. Some anomalies required deeper analysis and discussion within the team to understand whether they were real patterns or data errors. This resulted in an iterative workflow where analysis, validation, and cleaning were closely intertwined.
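The investigation step described above can be sketched with a simple interquartile-range check in pandas. This is an illustrative sketch only: the column name `budget_eur` and the sample values are assumptions, not the real tenders schema, and flagged rows are kept for team review rather than dropped automatically.

```python
import pandas as pd

def flag_outliers_iqr(df: pd.DataFrame, column: str, k: float = 1.5) -> pd.DataFrame:
    """Flag rows whose value in `column` falls outside the IQR fences.

    Rows are flagged rather than removed, so anomalies can be discussed
    and classified as real patterns or data errors.
    """
    q1, q3 = df[column].quantile([0.25, 0.75])
    iqr = q3 - q1
    lower, upper = q1 - k * iqr, q3 + k * iqr
    out = df.copy()
    out["is_outlier"] = ~df[column].between(lower, upper)
    return out

# Hypothetical tender budgets with one suspicious value.
tenders = pd.DataFrame({"budget_eur": [120_000, 95_000, 110_000, 130_000, 9_500_000]})
flagged = flag_outliers_iqr(tenders, "budget_eur")
```

Keeping the flag as a column (instead of filtering immediately) supports the iterative analyse-validate-clean loop: each pass can revisit earlier decisions.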
Another challenge was the fragmentation of the data. In Spain, relevant data is distributed across multiple sources, which meant that additional datasets, especially for Barcelona, had to be integrated. This required extensive data merging and transformation in Python, including aligning formats, resolving inconsistencies, and ensuring the final dataset was coherent.
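The alignment work described above can be sketched in pandas as renaming columns to a shared schema and normalising formats before combining sources. Everything here is a hypothetical illustration: the column names, date formats, and values are assumptions, not the actual Spanish or Barcelona datasets.

```python
import pandas as pd

# Hypothetical national source and a separate Barcelona source with
# different column names and date formats (illustrative only).
national = pd.DataFrame({
    "municipality": ["Madrid", "Valencia"],
    "date": ["2023-01-15", "2023-02-20"],
    "budget_eur": [250_000, 180_000],
})
barcelona = pd.DataFrame({
    "city": ["Barcelona", "Barcelona"],
    "fecha": ["15/03/2023", "02/04/2023"],
    "presupuesto": [310_000, 275_000],
})

# Align the Barcelona source to the shared schema before combining.
barcelona = barcelona.rename(columns={
    "city": "municipality", "fecha": "date", "presupuesto": "budget_eur",
})
barcelona["date"] = pd.to_datetime(barcelona["date"], format="%d/%m/%Y")
national["date"] = pd.to_datetime(national["date"], format="%Y-%m-%d")

combined = pd.concat([national, barcelona], ignore_index=True)
```

Parsing dates explicitly with `format=` (rather than letting pandas guess) is one way to surface format inconsistencies early instead of silently mis-parsing them.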
On the visualisation side, the challenge was to translate complex analytical findings into intuitive and accessible visual representations. This involved not only designing individual charts, but also ensuring that all visualisations worked together as part of a clear and consistent story.
Finally, maintaining a clean and coherent design across multiple visualisations required defining and applying a consistent structure, which became an important part of the overall user experience.
Description
This project was developed for StoryData, a data journalism agency in Barcelona, for the end client Construnews, a platform focused on architecture, construction, and data journalism. I joined the project at a later stage, after the initial data collection had already been completed by two colleagues. After providing general consultancy, my work focused on the tenders data: identifying patterns and defining how these insights could be translated into meaningful visualisations and a coherent data story.

I conducted an extensive exploratory analysis of the data, examining dimensions such as time, city, and distribution patterns. This included checking for missing values, identifying inconsistencies, and detecting outliers. Together with the team, we investigated these irregularities and iteratively cleaned the data, in a continuous back-and-forth between analysis and validation, until we reached a dataset reliable enough for storytelling and visualisation.

In parallel, I developed ideas for how to visualise the data and structure the narrative. Once we agreed on the visual direction, I prototyped visualisations in D3 and then implemented them in Svelte, adapting them to the front-end framework that had already been set up. A key part of my work was an interactive beeswarm visualisation: I developed its interactivity, including tooltips and intuitive filtering based on colour categories, allowing users to explore the data dynamically. Beyond that, I contributed to the design and refinement of several other visualisations and supported their implementation.

Additionally, I worked on defining a consistent layout system across the project. This included structuring each visualisation with clear elements such as title, subtitle, and source, and ensuring a clean and coherent visual design throughout the interface.
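The exploratory checks across dimensions such as time and city can be sketched in pandas along these lines. The table, column names, and values below are illustrative assumptions, not the real tenders data.

```python
import pandas as pd

# Hypothetical tenders table (illustrative schema and values).
tenders = pd.DataFrame({
    "city": ["Madrid", "Madrid", "Barcelona", "Barcelona", "Valencia"],
    "year": [2021, 2022, 2021, 2022, 2022],
    "budget_eur": [250_000, None, 310_000, 275_000, 180_000],
})

# 1. Missing values per column, to scope the cleaning work.
missing = tenders.isna().sum()

# 2. Distribution across the city and time dimensions.
by_city_year = (
    tenders.groupby(["city", "year"])["budget_eur"]
    .agg(["count", "sum"])
    .reset_index()
)
```

Summaries like these are a starting point for spotting inconsistencies; each irregularity they surface still needs the kind of case-by-case team discussion described above.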
Another important part of the project was integrating additional data from Barcelona, as the data landscape in Spain is fragmented across multiple sources. Together with Carlos, I worked on merging and harmonising these datasets using Python, ensuring consistency and compatibility across sources. Overall, my contribution spanned the full pipeline, from data analysis and cleaning to visualisation design and front-end implementation, always in close collaboration with the team.
Topics
Data Analysis, Exploratory Analysis, Data Cleaning, Python, Data Visualisation, Svelte, D3, Beeswarm, Interactivity, Front-end Development, Data Journalism
Tools
HTML, CSS, JavaScript, D3.js, Svelte, Git, Python
Year
2026
Clients
StoryData (Barcelona) for Construnews
