Our industry is experiencing the kind of explosive growth that has happened only twice before: with the electrification of towns and the rollout of copper telecommunications networks.
Today, a fiber network explosion is being driven by enormous demand for bandwidth. But our network design capability isn’t prepared for it.
We face the mounting pressures of surging demand for faster rollout of larger networks, stiff price-based competition from offshore engineering firms, and a design effort that is increasingly distributed not just between teams but among different organizations.
Yet network design approaches have not fundamentally changed in decades.
Growth in the number, scale, and complexity of the networks being designed requires a corresponding shift in the way we approach the design task itself.
Better network “drawing” software isn’t enough. Geospatial data and data analysis tools aren’t enough. Throwing more engineers at the task isn’t enough.
A technological solution that removes human limitations from large-scale, complex network design is required to ensure we can meet these challenges as an industry.
Using big data for better design
The current network design approach may be decades old, but today’s technology has brought our industry a level of geospatial capability that we couldn’t have dreamed of even five years ago.
Today, we have at our fingertips a structured, detailed, highly accurate view of where things are — buildings, utility lines, and other infrastructure. While that makes things clearer, the sheer scale of that detail for a large network can drastically complicate the design process.
Unless, that is, we use that information to create a model for designing the “best” network.
The model is mathematical, deterministic, and exhaustive: it considers the whole data set and produces consistent answers based on the engineering and business rules, costs, and project constraints it is given.
The model isn’t just software that analyzes or organizes information. It’s creative. It does what was traditionally human “thinking” work to produce an entirely new output: the best possible design.
The trouble with traditional design
Consider giving the same design project to five different engineers. What would be the result?
Since each engineer has different experiences, understandings, and preferences, you’d end up with five different designs. Each design would serve the demand and comply with the engineering standard, but each would come at a different cost, and none of them would be the lowest-cost design.
Human minds can’t take on an entire large-scale network design all at once, so we chunk the task, optimizing the design for sub-areas or zones within the network and finally connecting them to form the whole. This approach can optimize the cost of each area locally, but it can drift well away from the lowest total network cost.
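The cost penalty of chunking can be seen in a toy model. The sketch below (a hypothetical illustration, not any particular vendor’s algorithm) compares the minimum-spanning-tree cable cost of connecting a few demand points when the whole network is designed at once versus when each zone is designed separately and then joined by the shortest tie cable; the point coordinates and zone split are invented for the example.

```python
import itertools
import math

def mst_cost(points):
    """Total cable length of a minimum spanning tree (Kruskal's
    algorithm) over the complete Euclidean graph on `points`."""
    parent = list(range(len(points)))

    def find(i):
        # Union-find with path halving.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    edges = sorted(
        (math.dist(points[a], points[b]), a, b)
        for a, b in itertools.combinations(range(len(points)), 2)
    )
    total = 0.0
    for length, a, b in edges:
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb  # join the two components
            total += length
    return total

# Invented demand points, split by an arbitrary zone boundary.
zone_a = [(0, 0), (0, 10)]
zone_b = [(1, 0), (1, 10)]

# Global design: optimize the whole network at once.
global_cost = mst_cost(zone_a + zone_b)

# Chunked design: optimize each zone separately, then add the
# shortest tie cable between the two zone designs.
tie = min(math.dist(p, q) for p in zone_a for q in zone_b)
chunked_cost = mst_cost(zone_a) + mst_cost(zone_b) + tie

print(global_cost, chunked_cost)  # 12.0 vs 21.0 units of cable
```

Each zone’s design is locally optimal, yet the stitched-together network needs 21 units of cable where the global design needs only 12. The penalty grows with the number of zones and the irregularity of their boundaries, which is exactly the gap a whole-of-network model closes.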
A good way to think of it is hiring a ghostwriter rather than writing the entire book yourself. With machine-led design, an engineer can focus on telling the machine what they want (business and architectural rules) and then let the machine lead the way in generating the design. Importantly, engineers can impart their wisdom to the machine, so that less technical users can generate designs.
Better design, faster
An algorithmic design model can tackle a large design area all at once, reliably gaining “global” optimization network efficiencies that are close to impossible for humans to achieve unaided.
It will consistently deliver what is objectively the best solution for the data and constraints it is given.
And it will produce each design much more quickly than traditional approaches.
Instead of spending hours manipulating data, checking and correcting errors, and so on, engineers can instead apply their professional experience and other contextual requirements to the design produced by the model.
The end result is a combination of machine-assisted and intelligent human design that’s truly optimized for the network design tasks we’re facing today.
It is algorithmic, machine-assisted design that will allow us to meet the network rollout challenges we face not just today, but tomorrow as well. This kind of technology is part of an end-to-end design data model and approach that moves beyond simple automation. This is the cornerstone of the digital disruption of traditional network engineering.
It allows us to design better networks faster.
Where to from here?
If you couldn’t join Paul Sulisz a few weeks ago, I suggest taking a look at the webinar he presented, “Taming the Machine for OSP Engineering Success.”
If there’s one thing you can do to get started, it’s this: don’t be afraid to challenge the traditional approach. There are brilliant people doing amazing things across the industry.
We’re always happy to have a chat, and in the meantime, stay tuned for more blogs, webinars, and videos from us!