Data-Driven Design: The Backbone of Your FTTH Project - Part 3
- Lisa Mayo
- July 10, 2024
Whether your goal is to quickly evaluate a new network build or to fully engineer a network for construction, a realistic design should be your starting point. You cannot begin the design process without input data, and the quality of a design can only be as good as the data used to generate it.
At Biarri Networks, we specialize in data-driven design: collecting, generating, combining, and transforming diverse input data sources into fully engineered fiber networks.
Across this article series, we cover:
- Determining your data requirements: How design fidelity drives data requirements.
- Sourcing data: Understanding the data sources available as input to network designs.
- Data processing: A quick look at Biarri Networks’ sophisticated data processing tools and their applications.
Today, in part 3, we take a deep dive into the final topic:
Data processing
Once all design data has been collated, it needs to be reviewed and processed for use in the design. This can be an enormous task when faced with many sources of data of varying quality. For large datasets, automation is the most efficient way to solve this problem.
Over the years, Biarri Networks has developed various software tools to automate the cleaning, generation, and augmentation of GIS data at scale. These tools provide capabilities for:
- Cleaning of existing aerial and underground data records
- Cleaning of common GIS data issues like duplicates and invalid geometries (a minimal sketch of this step follows the list)
- Simplification of messy/cluttered data sources
- Snapping, splitting, and inferring missing records
- Generation of new data, such as aerial or underground networks
- Generation of new aerial infrastructure, following side-of-road placement guidelines and preferred span lengths (also sketched after the list)
- Generation of dual-sided or single-sided greenfields underground networks with road crossing patterns of choice
- Combining of address record sources, and interpretation of their fiber requirements for any location type
- Carving and merging datasets
- Optimization techniques for determining service lead-ins to serviceable locations
- Augmentation of greenfield and brownfield assets into a complete dataset
- Ingestion of existing design records, including splice records, for determination of available capacity within cables and devices; this information can be used as the basis of network expansion design projects
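To make the cleaning step concrete, here is a minimal sketch of automated GIS cleanup. It uses GeoPandas and Shapely as assumed stand-in tooling (our internal tools are not shown here), and the file name is purely illustrative.

```python
# Minimal GIS cleanup sketch using GeoPandas/Shapely as assumed stand-in tooling.
import geopandas as gpd
from shapely.validation import make_valid

def clean_layer(path: str) -> gpd.GeoDataFrame:
    gdf = gpd.read_file(path)

    # Drop exact duplicate records (same geometry and attributes).
    gdf = gdf.drop_duplicates()

    # Drop records whose geometry duplicates an earlier record.
    gdf = gdf[~gdf.geometry.duplicated()]

    # Repair invalid geometries (self-intersections, bow-ties, and so on).
    gdf["geometry"] = gdf.geometry.apply(
        lambda geom: geom if geom.is_valid else make_valid(geom)
    )

    # Discard anything that is still empty after repair.
    return gdf[~gdf.geometry.is_empty]

# e.g. pits = clean_layer("existing_pits.gpkg")  # illustrative layer name
```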
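Similarly, a greatly simplified version of aerial infrastructure generation could look like the sketch below: offset candidate pole positions to one side of the road centreline and space them at a preferred span length. This is an assumed, illustrative approach only; real placement rules must also account for obstructions, clearances, and existing assets.

```python
# Illustrative (not production) placement of candidate poles along a road.
# Assumes road coordinates are in a projected CRS measured in metres.
from shapely.geometry import LineString, Point

def propose_poles(road: LineString, offset_m: float = 3.0,
                  span_m: float = 50.0) -> list[Point]:
    # Shift the alignment to the nominated side of the road centreline.
    alignment = road.parallel_offset(offset_m, side="right")

    # parallel_offset may return a MultiLineString for complex roads;
    # keep the sketch simple by taking the longest piece.
    if alignment.geom_type == "MultiLineString":
        alignment = max(alignment.geoms, key=lambda part: part.length)

    # Place a candidate pole every span_m metres along the alignment.
    positions = range(0, int(alignment.length) + 1, int(span_m))
    return [alignment.interpolate(d) for d in positions]

# e.g. poles = propose_poles(LineString([(0, 0), (500, 0)]), span_m=60)
```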
These data processing tools are used extensively on every single one of our design and engineering projects, guided by our expert design team. In fact, in the last year:
- At least 12% of the automation tools run by our design team were related to data preparation (the remainder includes design tools, splicing tools, and other processes).
- We spent at least 27.8 days (about four weeks) running data processing automations.
- The largest share of our data-processing activity went to processing address data (17%), followed by general data cleaning (16%), lead-in generation (14%), data filtering (13%), and retrieval of addresses, streets, and parcels from our “base data” database (7%).
- Most of our time was spent generating optimal service lead-ins (this continues to be a particularly challenging automation problem to solve) and carving and merging large datasets.
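To show why lead-in generation is an optimization problem rather than a simple lookup, here is a deliberately naive sketch (an assumed approach, not our production optimizer) that connects each serviceable location to its nearest point on the street network. A real solution must also respect property boundaries, crossing rules, shared trenching, and cost trade-offs, which is where the difficulty lies.

```python
# Naive lead-in generation sketch: nearest-point snap from each serviceable
# location to the street network (assumed illustrative approach only).
import geopandas as gpd
from shapely.geometry import LineString
from shapely.ops import nearest_points, unary_union

def generate_lead_ins(addresses: gpd.GeoDataFrame,
                      streets: gpd.GeoDataFrame) -> gpd.GeoDataFrame:
    # Merge street centrelines into one geometry for nearest-point queries.
    street_union = unary_union(list(streets.geometry))

    lead_ins = []
    for location in addresses.geometry:
        # Closest point on any street to this serviceable location.
        snap_point = nearest_points(location, street_union)[1]
        lead_ins.append(LineString([location, snap_point]))

    return gpd.GeoDataFrame(geometry=lead_ins, crs=addresses.crs)
```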