Product Management Coaching Practice

Wow, it’s been a long time since I’ve written on here. My focus has been on sharing my product management experience through my coaching practice at CPO.ONE.

Some fundamentals that are helpful to new product managers and seasoned product leaders at unicorns alike:

  1. Product Management is hard, especially when done correctly.

  2. Agile is not a product management framework (Jeff Lash)

  3. Product is not what you ship, product is what you measure.

  4. Measure to learn, build to measure.

  5. Invariably, solution competence first requires problem competence.

Sustainable Business Success

 

Sustainable success in business is the dutiful combination of data-informed opportunity multiplied by the investibility of the founders.

by Lance G. Douglas

 

Summary

Many books have been written, and many great frameworks exist, for creating a business, its products, its branding and marketing, its processes, its business model, its financing and IPO, and so on. I feel like I’ve read most of them, and I’ve certainly read enough to know that they commonly fail to bring the end-to-end story together: how to build a great business that is sustainable for customers, staff, investors, and founders.

I’ve been working on an understanding, through study, experience, failures, and prayer, of what makes success possible, likely, achieved, and sustained for a little over 10 years. The following is the summary of my thesis, which has, only as recently as a couple of days ago, finally become tangible and close to complete.

Why, and How, Now?

The lenses that I collected and gained along the way through life have given me a lot of diverse perspective that keeps me well grounded, empathetic, and hopeful. I spent a lot of my life solving problems; I considered myself an outstanding disruptive-innovation architect (before ever learning of Christensen) who could deliver solutions extremely quickly, with far too few resources to ever achieve both the outcome and my health at the same time.

I attribute my successes to the fact that I could really understand the solution being asked for, dive right in, and know the simplest, most direct route to a solution that was both relevant and future-architected for growth and extensibility. I look back and can tell you that 110% of the time I was pulling a solution towards a problem, and that is never satisfying.

Recently, I’ve been lucky enough to have some downtime to self-reflect and rejuvenate near the mountains. During the past couple of months, I’ve come to realize that I’m not intrinsically motivated by being a “solution architect”, which is the implementation of the solution, even though I had thought it was the design and strategy of solutions.

I am intrinsically motivated by
“problem science followed by solution science”


I am intrinsically motivated by “problem science followed by solution science”. The main difference is that solution architecting pulls a solution towards the problem, while the two sciences push a problem towards a solution. It’s subtle, but this paper explains it in detail.

Combine the current situation with my passion for helping others achieve their passions, and that has driven me to re-evaluate all of my perspectives through this new lens. I began formally studying Product Management at ProductSchool.com, and that really put everything within the correct set of lenses for me.

Finally, I set out to validate a hypothesis: that angel and seed investors have a low seed hit-rate of only 10% and would be interested in paying for a product I could deliver them, a 2x-10x improvement on their early-stage investment hit-rate, with the second side of the market being founders wanting to become both funded and sustainably successful. I summarized this into a complex well working system: the combination of data-informed opportunity multiplied by investable founders.

I spoke with several family, VC, and angel fund managers during my problem science phase.

The MVP: a start-up development process that resulted in the combination of data-informed opportunities, at revenue, with investable founders, with a standardized scoring system tuned to the investors’ risk preferences.

Problem Science followed by Solution Science

The combination of the two, to me, is the fundamental definition of Product Management. More importantly, Problem Science is only ever followed by Solution Science.

The outcome of this combined process is only ever:

  1. A problem not worth pursuing; or

  2. A problem worth pursuing + a solution not worth pursuing; or

  3. A problem worth pursuing + solution worth pursuing.

It is critical at this point to understand that a solution is purely an equation defining what changes were tested to move the metrics, to what impact and value, and via which methods.

Problem Science is the process of gaining insight until you feel confident that you understand:

  • the hypothesis;

  • the problem’s root cause(s) and for whom;

  • the impact of the problem and to whom;

  • quantification of how the problem manifests and to whom;

  • qualification of the value of the problem being solved and for whom;

  • assumptions taken and/or unknowns;

  • SWOT analysis of understanding of root cause(s), impact, and value;

  • the metrics needed to measure the understanding on a continuous basis; and

  • one or more prioritized equations that clearly articulate the conclusion(s), favourably or otherwise.

Solution Science is the process of attempting to discover a favourable means to transform favourable problem-insight into targeted value(s) by:

  • a hypothesis that the simplest possible introductions into the Problem Science equation(s) will move the metrics towards the desired value, only to the degree of a minimum lovable product, for the best personas to target right now;

  • highly targeted tests employing the simplest approaches to exemplify, quantify, and qualify those introductions;

  • SWOT/GAP analysis of the total set of problem equations; and

  • revisiting of the Problem Science process with all gained data; or

  • one or more prioritized equations that clearly articulate the conclusion(s), favourably or otherwise.
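
To make the shape of these outputs concrete, here is a minimal sketch of how the outputs of the two sciences might be captured as data; the type names and fields are purely illustrative, not a prescribed schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProblemInsight:
    """Illustrative container for the outputs of Problem Science."""
    hypothesis: str
    root_causes: list[str]             # root cause(s) and for whom
    impact: str                        # impact of the problem and to whom
    quantification: dict[str, float]   # how the problem manifests, per persona
    value_if_solved: dict[str, float]  # value of solving, per persona
    assumptions: list[str]
    swot: dict[str, list[str]]         # strengths/weaknesses/opportunities/threats
    metrics: list[str]                 # metrics to measure the understanding continuously
    worth_pursuing: bool               # the prioritized conclusion

@dataclass
class SolutionTest:
    """Illustrative container for one Solution Science iteration."""
    introductions: list[str]           # simplest possible changes tested
    target_personas: list[str]
    metric_movement: dict[str, float]  # observed movement per metric
    worth_pursuing: Optional[bool] = None  # None = revisit Problem Science with the new data

def outcome(problem: ProblemInsight, solution: Optional[SolutionTest]) -> str:
    """The combined process only ever ends in one of the three outcomes above."""
    if not problem.worth_pursuing:
        return "problem not worth pursuing"
    if solution is None or not solution.worth_pursuing:
        return "problem worth pursuing + solution not worth pursuing"
    return "problem worth pursuing + solution worth pursuing"
```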

 

Start with Why - Simon Sinek

Sinek’s decade-old video is still just as inspiring and informative for me.

His discovery of how the world works, and his codification of the biological aspects that drive great leaders, is greatly simple. However, in its simplicity, it is not a shortcut.

Knowing your why, how, and what is not a recipe for success. Sinek’s circle drawing is a summation, reflecting what can be a masterfully crafted opportunity helmed by focused leaders.

 

The Problem with Why


In the same way that Sinek states that “profit is always a result, never a goal”, I believe that a company achieving the simplicity portrayed in his codification is only a result of dutiful design, construction, measurement, and management of an opportunity, and only when combined with founders well prepared for their part in the journey to success. What I’ve come to realize is that the journey to the clean, simple, concentric “Why, How, What” that Sinek depicts is not created in the same outward progression as drawn.

That direct approach is where companies and founders make their core mistake: taking an inspiring “why”, coming up with a great “how”, and then making a “what”.

You can sum up the common journey as often witnessed:

  1. Why (Vision): We believe in making this big impact in the world.

  2. How (Mission): We’ll demonstrate our beliefs by this powerful approach.

  3. What (Product/Service): Here is what we do to convey our how to you.

The problem with this approach is that it is far too simple to fake it: fake the dutiful efforts required to properly organize both the founders and the opportunity for success, and more importantly, for perpetual success.

I believe that this problem is firmly exemplified in the statistics of early-stage investments. Approximately 10% of seed rounds return the investment, and the horizon is 7-10 years.

When talking with angel, family foundation, and seed fund managers, the average rate at which the post-mortem cause of failure was “limitations of the founders” was an astonishingly high 67%. While that is still a bit subjective at this point, you have to take into consideration that the founders’ past successes barely moved the needle on future success. Was it more likely just plain luck to be a successful early-stage investor? Maybe, but why? I wanted to understand the root.

 

Investable Founders & Data-Informed Opportunities

I set out recently to understand the root cause of the early-stage hit rate being a measly 10%, with a hypothesis that I would need to increase that hit rate to somewhere above 20% to create value. I have been through many startups, as well as fostered transformational changes in large enterprises, so I broke down the positive, negative, and missing pieces in each, the shortcuts if you will, which led to the shortcomings that I witnessed (or performed). I then began researching well known failures and successes. I was looking for a common thread, and I found one.

Ideas are worthless, it’s execution that matters; but execution without dutiful attention to opportunity stability will inevitably result in failure. This rings loud and clear as a nod to John Gall’s famous axiom.

A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over, beginning with a working simple system.
— John Gall

My first realization was that often the shortcut taken is a lack of good appreciation and execution of Problem and Solution Science. It may be a lack of knowledge of product management practices, or simply a disdain for process. So, how do we counteract that? What are the root causes and factors that drive people to shortcut their way through execution, inevitably to achieve failure?

In my research, I laid out the simple well working systems that combine to make the complex well working systems of Sinek’s “What” and “How”. That is to say, I discovered that Sinek’s diagram is not simple, it is complexity cubed. It’s a well working complex system made up of no less than three well working complex systems, each of which is made up of its own complex well working systems, and so on. I sought to unravel the complexity down to its core simple well working systems.

Hint: the resulting Solution Science hypothesis was that I could automate the execution and gating process of evolving the simple systems into the necessarily effective approach that Sinek arrived at. The designed MLP was to use an automated set of scales to guide founders into problem interrogation and discovery and then force their solution to be derived from the qualified value (both corporately and to customers) that the Problem Science had isolated. This was nearly identical to the SCORM-delivered logic-tunneling system I architected and engineered two decades ago, so I had an advantage, and/or a blindspot to the approach, that I had to watch out for.

Next, I shared my premise of the problem (not solution) with a successful investor friend of mine, and the response was positive, but they hinted at an additional complementary factor in the structure of the root cause: founders’ limitations in getting out of the way of a good opportunity, where those founders are anchoring down the ability to achieve the next stage of growth and/or sustainable success.

Reading between the lines of this new information, I returned to the Problem Science to iterate some more, seeking to quantify and qualify what makes founders not investable, and what the root cause of that was. Perhaps, as my hypothesis now indicated, if I could solve for the combined root causes of opportunity and founder inadequacies, that would move the needle for customers, measured by the increased rate of success of both founders and investors.

I was able to isolate five key measures that investors use for qualification of an early-stage investment, and also to build an algorithm to match their fund’s risk profiles to a customized “investibility score” (a minimal illustrative sketch follows the list):

  1. DScore: Defensibility

  2. FScore: Founding Team

  3. GScore: GTM Strategy

  4. MScore: Market Situation

  5. RScore: Regulatory/Privacy/Social Complexity

  6. IScore: Investor-Specific Holistic Risk Profile Matching
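
A minimal sketch of how such a composite score might be computed is below; the 0-10 scale, the weighting scheme, and the example numbers are illustrative assumptions, not the actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class OpportunityScores:
    """The five measures, each scored 0-10 (the scale is an illustrative assumption)."""
    d_score: float  # Defensibility
    f_score: float  # Founding team
    g_score: float  # GTM strategy
    m_score: float  # Market situation
    r_score: float  # Regulatory/privacy/social complexity (here, higher = less risky)

def i_score(scores: OpportunityScores, risk_weights: dict[str, float]) -> float:
    """IScore: weight the five measures by an investor-specific risk profile.

    `risk_weights` maps each measure to the relative importance that a
    particular fund places on it; weights are normalized so the result
    stays on the same 0-10 scale.
    """
    raw = {
        "d": scores.d_score, "f": scores.f_score, "g": scores.g_score,
        "m": scores.m_score, "r": scores.r_score,
    }
    total_weight = sum(risk_weights.values())
    return sum(raw[k] * w for k, w in risk_weights.items()) / total_weight

# Example: a fund that cares most about the founding team and defensibility.
example = OpportunityScores(d_score=7, f_score=8, g_score=5, m_score=6, r_score=9)
print(i_score(example, {"d": 0.3, "f": 0.4, "g": 0.1, "m": 0.1, "r": 0.1}))  # 7.3
```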

While this is a great notion of the problems, and value, that investors could realize from the problems I’ve interrogated, I still had to figure out a way to quantify and measure those factors in founders and their ideas. So I set out to test the underlying simple well working systems that I could use to gain the necessary insight into driving those metrics (scores).

 

Iterating Through Solution Science

All too often, experts and novices alike have a passion for a solution that they’ve been conjuring up. Then, once some hint of a problem is witnessed by these well-wishing founders, one which looks like a target that they could throw their solution at, they work to hone the problem to look more and more like the very target of their solution. They are actually trying to justify the solution that they love, and their passion for the solution can often be confused with passion for the problem.

That situation is a purely emotional one, easily primed for overindulgence, and unfortunately a real possibility for unethical decisions[1], consciously or unconsciously. How do we counteract that? I set out to find out, and my General Hypothesis was that the core drivers of the founders’ “Why” could be unraveled methodically and reasonably, then rebuilt back up with simple well working systems. This would be measurable by the compounding value realized by founders, and their companies, that are better stewards of a problem, investable, and sustainably successful over the long term.

The latest iteration of Solution Science explored the manipulable levers of the root causes and value impacts, and began with these hypotheses:

  1. organize the process of capturing and objectifying data via simple well working systems into measurable equations of causation, state, influences, and value, but founders are typically dealing with relatively subjective and/or highly-biased data;

  2. realign founders’ complex unmeasurable motivations, positions, and situation into simple well working systems that support the complex well working system of Problem Science followed by Solution Science, but we’re dealing with humans; and

  3. protect from organic erosion of success that is caused by a lack of sustained focus on the levers in the simple well working systems, but the two above equations do not perform within a bidirectional and continuous feedback loop.

Armed with those three solving equations, I ran some initial paper tests and discovered the remaining Problem hypothesis:

4. Creating equations that solve for the first three problems [of “Why”, “What”, and sustainable protection] still doesn’t make any difference in deriving sustainable value for customers, staff, investors, or founders.

So, another round of Problem interrogation was able to reduce the complexity of Sinek’s “What” down to a few layers of complex well working systems to finally arrive at their simple well working systems. The outcome was astonishing, and refreshingly simple to explain. To make it as simple as possible, I further codified the overall set of systems against Sinek’s Why, How, What.

Conclusion

Holistically, Simon Sinek’s codification is brilliant, but it is nothing close to simple. It is a well working complex system made up of several tiers of complex well working systems, all inevitably built up from simple well working systems.

I reached an equation for solving hypotheses #1, #2, and #3, using Knowledge Graphs and Logic-Tunneling, with the future potential to automate using AI, combining Supervised and Unsupervised Neural Networks, to act as the world’s best executive coach: a truly objective executive assistant able to process billions of possible combinations of biases, inputs, levers, and outcomes. I’ll decide whether or not to disclose a little bit more about this as I build a few more iterations of it.

The equation for #4, the “How” is the overall process of building a successful business and implementing the simplest operating procedures to deliver the “Why+What combination”.

What I discovered is that there is literally nothing new here; every simple well working system that I used is very well documented in books. What I did was read a lot of those books, take courses from some of the greatest minds in their areas, and phone/email/visit with many industry leaders, and then either find the necessary problems that each simple system was defining, or relate their stories across the different narratives.

Then I quantified and qualified where they each fit, or don’t, in the overall levers of value. With that understanding, it became clear that there is a component of Why, How, and What layered below each of Sinek’s.

In the below diagram, I depict the necessary flow, which follows an extremely simple narrative:

  1. there are no shortcuts

In walking through the diagram, you must follow from the top left down and avoid all red-dotted lines, which exemplify common shortcuts that I found various failed businesses have taken (a minimal gating sketch follows the walkthrough below).

  • Everything is a compounding factor, so it must be measurable, monitored, and reacted to.

  • Founders:

    • Wants: Founders have to want to self-actuate.

  • The Why:

    • Experience: The founders’ experience in creativity and leadership (not in the problem domain) needs to be qualified for both improvement and maintainability.

    • Perspective: The founders need to have a humble perspective that gains from their experience and drives the source of their passion.

    • Passion: The founders’ passion, the root of the why, the inspirational combination of the above.

    • Why this why?: the combination of the above components is critical, else it’s a hobby and the self-actuation will collapse or be ill-directed in the future.

    • Timing: even with a perfect why, if the timing isn’t right, the project will achieve failure

    • Situation: the situation of all of the prior components and reality must align

    • How this why?: the combination of all the previous components must be a positive yes

    • Trigger: even with the why and how of the founders’ self-actuation, there needs to be a trigger to cause it all to align into a completed “why”

    • Exactly what is this Why!: the actual visionary and meaningful battle cry based on at least some form of all the components. If a founder can’t answer the clear details behind his why, it is not sustainable.

  • The What:

    • Why this what?: the general hypothesis that ties The Why to all levers, metrics, and solutions.

    • Problem Science: output as equations, SWOT, levers, personas, assumptions

    • Metrics - Continuous Improvement: the metrics discovered here measure the components within the equation

    • How this What?: without the detailed analysis, resulting equations, and metrics to continuously monitor and improve, this what will fail

    • Solution Science: output as prioritized equations of change agents that result in value directly corresponding to the equations and metrics of the previous step

    • Exactly what is this What!: the actual product/service to be delivered, to whom, and to what value, measured by which metrics, with what risks

  • The How:

    • Only after a clearly articulated What, based on a clearly articulated Why, is completed can you begin on the How

    • Strategy: how to realize The What and The Why, including defensibility options, GTM, positioning, pricing

    • Advantages: the advantages the founders, partners, situation, etc have or can create in achieving and defending The What and The Why

    • Why this how?: a culmination of all preceding components filtered through strategy and advantage development; many folks actually begin to fail by starting at this step, feeling that their technical or business expertise is an advantage worth carrying backwards to The What and The Why.

    • Simplicity Science: now armed with the beginnings of a viable business, tear it back down to “as simple as possible, but no simpler” (Einstein). I’ll document this process like the others shortly, but simply put, it is taking the “Why this how” and removing absolutely everything that isn’t absolutely necessary to deliver a Minimum Lovable Product.

    • Business Model: whip out your business model canvas and only now begin the process of choosing the key attributes, costs, channels, etc, else you’ll fall into the trap of using your business model canvas as a solution and trying to make every step prior to this one fit into it, and spur failure.

    • JourneyMap with Continuous Improvement: at this point you need to document every step of every participant, customer, vendor, metric, and process involved in carrying The Why to the markets, AARRR’ing your customers, and feeding your continuous feedback loop

    • How this how?: only with the simple well working systems and their combinations now complete, ready to actuate, and measurable are you able to move to the final stage of business science

    • Value Science: the final step, and the one that trips up most established enterprises seeking valuable change, is only now taking the culmination of all the preparation and clearly defining the internal operational processes to onboard, deliver, support, and monitor your business. At first this can be fairly simple, but before you can grow beyond a few customers and staff, you must articulate how every process receives, works, and outputs the tenets of Your Why, Your What, and Your How

    • Exactly what is this How!: having combined all previous systems, only at this point would I consider a business viable.
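
To make the “no shortcuts” rule concrete, here is a minimal sketch of the walkthrough as a strictly ordered gating process; the pass/fail mechanics are purely illustrative, and the gate names are abbreviated from the walkthrough above.

```python
# Illustrative only: each gate must pass, in order, before the next is attempted;
# gate names are abbreviated from the walkthrough above.
GATES = [
    "Wants",                                                  # Founders
    "Experience", "Perspective", "Passion", "Why this why?",  # The Why
    "Timing", "Situation", "How this why?", "Trigger",
    "Exactly what is this Why!",
    "Why this what?", "Problem Science", "Metrics",           # The What
    "How this What?", "Solution Science",
    "Exactly what is this What!",
    "Strategy", "Advantages", "Why this how?",                # The How
    "Simplicity Science", "Business Model", "JourneyMap",
    "How this how?", "Value Science", "Exactly what is this How!",
]

def walk(gates_passed: set[str]) -> str:
    """Return the first gate that has not been passed; skipping ahead
    (a red-dotted shortcut) is simply not possible in this ordering."""
    for gate in GATES:
        if gate not in gates_passed:
            return f"stopped at: {gate}"
    return "viable: Why, What, and How are each clearly articulated"

# Example: a founder who jumped straight to strategy without finishing The Why.
print(walk({"Wants", "Experience", "Perspective", "Passion", "Strategy"}))
```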

This may seem overwhelming, but most of these simple well working systems are either already common, absolutely necessary, and/or just polished definitions of what can be measured. Ultimately, if you think of a time that you had to write corporate and cascading SMART Goals, OKRs, and/or Strategy Maps, you’ll quickly realize that any struggle that you had in those tasks, actuating change, and measuring the success was a direct and quantifiable result of one of these simple well working systems missing from the organization.

The will to win is not nearly as important as the will to prepare to win.
— unknown

An interesting discovery that I found along this journey was that you should never create your How without first clearly defining the What with equations of both the Problem and Solution sciences. But in depicting the end result, Simon Sinek’s diagram still holds true.

In other words, “People don’t buy what you do, they buy why you do it” is still true, but so is “Success is driven by what you do, sustained by how you do it.”

 

… more to come… please comment and help me test this.

 

  1. Seeing green: Mere exposure to money triggers a business decision frame and unethical outcomes http://fm.cnbc.com/applications/cnbc.com/resources/editorialfiles/2013/06/12/Kouchaki%20et%20al%20%20OBHDP%202013.pdf

Start-up Inspired Quote

"The fact that an opinion has been widely held is no evidence whatever that it is not utterly absurd; indeed in view of the silliness of the majority of mankind, a widespread belief is more likely to be foolish than sensible." - Bertrand Russell

Canadian, Municipal FTTH Realities

I am just home from an excellent technical summit put on by @Cybera.  It was the @CyberSummit14.

At this summit, I gave a talk on "FTTH as Municipal IaaS".  I started my session reliving the successes and failures in the Canadian FTTH market.  Adding the failures part made the session way too long, but it shed light on the reality of Canadian FTTH; it is very complicated, fraught with active enemies, and has poor results when outside of a bigger-picture initiative.

To summarize in one sentence: citizens don't generally volunteer to be customers of government, and municipalities shouldn't be involved in the retail ISP/TV/phone provider industry.

The reasons are many and varied in their depth, and I'd be glad to elaborate on demand for anyone, but fundamentally, a Municipal FTTH initiative will likely only succeed when part of a much larger initiative.  And the larger the capital costs, the more risk, thus the more mitigation, and rightly so, the more value-chain options are required.

For example, if the municipality is larger than 400K dwellings, its value-chains may look as follows:

  • they likely have 3000 - 5000 things that could be points-of-interface within their corporate boundaries that will at some point in the very near future need a high-speed data connection.
  • they're likely to have 5000 - 15000 municipal partners (schools, police, fire, airports) that would immediately see value in connecting to a municipality-wide high-speed network.
  • there are likely 50K or so SMBs and enterprises that could find ways to be more profitable within the community by being better served and more connected throughout the community.
  • and the 400K dwellings would all be near one of the points above and would likely want competition in the market for both innovation and opportunity in the digital economy, and that the muni would like to exercise their diminishing opportunity to reach those dwellings in the existing, but finite, right-of-ways.

It makes sense to mitigate the risk of the wants with the needs, and cost of the needs with the wants, and to roll out a city-wide solution that empowers all levels of public interest.  But it doesn't make sense for that municipality to venture into the private-industry layer of retail services to those businesses and homes, or possibly even those civic partners.

The conflict of interest is between the purpose of a municipality and the purpose of retail services: the former, I believe, is fundamentally and necessarily open within its governance; the latter is exclusionary when looked at along the boundaries of its marketing.

A municipality's purpose is not focused on a specific target-market within its jurisdiction; it is about all of the citizens, all of the time, and validation is predicated on the perception of lifestyle opportunity afforded to the citizens by the political and administrative platform in office.

A for-profit business, in the best possible situation, is focused on their customers' happiness, which is validated with profit.

The dichotomy is that the governance, sustainability, and capital cycles operate on and in polar-opposite paradigms. Mixing the two will never be a solution, but rather a push-pull relationship with completely different market drivers; one being political pressures, the other private-market pressures.  In a worst-case scenario, it will be a lobbying and unfair advantage relationship that can only end in scandal.

Most people don’t volunteer to pay the gov’t for anything...
— #CyberSummit14 @LanceGDouglas

Is there really a place for Municipal FTTH, if municipal-purpose is really founded on the affording of jurisdiction-wide opportunity, but not on validation by profit?  And if there is, is there a way for that municipality to avoid the conflicting interests of the complex layers of execution required in the full-stack of a FTTH network, operations, and retail services?

YES, and yes.

The larger municipalities would be well served to participate at the layer that they know best, which is infrastructure.  I say "know best" relative to the complexity of services delivered on the most retail-customer-facing side.  It is similar to the complexity of the services and upshots of roads, but dissimilar to the relative simplicity of electricity and water upshots at the retail demarcation point.

That FTTH infrastructure layer, which would provide opportunity across all four municipal value-chains mentioned above, is also mitigated by the full deployment having many more possibilities of value realization than a proportionate deployment based on the priorities or demands of only one or two value-chains.  A full commitment to a full FTTeverything deployment is a set-it-and-forget-it solution for 40-60 years.  Not one that is only an immediate bandage, nor one that is overkill, but rather a finite policy on connectedness being as basic a citizen need as clean water, sewer, garbage pickup, roads, electricity, and bylaws.

In my opinion, the Municipal FTTH space must demarcate at the fibre asset in the ground, up to the premises edge termination point.  This layer boundary is referred to as the NetCo.  There is a possibility to extend the NetCo mandate through the other layers (OpCo and RSP) with a soft demarcation point within each premises, providing basic internet at the minimum standards set federally within its definition of "highspeed", with zero bells and whistles.  The latter half is not a contradiction; it is a base-level service for a jurisdiction-wide option of opportunity, but also a possible key change-agent.  Contact Lightcore Group for complete details on the deployment and risk analysis in your municipality, today.

Beyond the NetCo is the network operator that manages and enforces policy on the network to the Retail Service Providers (RSPs).  There is no consideration for net-neutrality here other than that non-neutral is a business model, and I fundamentally believe that private-industry business-models should not be regulated, but rather that if there is fundamental belief in net-neutrality business-models, then government may need to look at ways of incentivizing private-industry to grow in that direction, organically or naturally.  The same way that solar, wind, and bio-fuel energy-sources required heavy subsidization (development capital, grants, and otherwise) to become available in the market as an option, so too might net-neutrality-based business-models need subsidizing, not unfair advantage, upfront.

There needs to be clear delineation between the one or more RSPs and the NetCo, and that is the role of the OpCo(s).  The OpCo(s) is a gateway and gatekeeper for RSPs to on-board and compete openly, without the ability of the RSP to influence the municipality with direct-lobbying channels, unfair contracting-mishaps, or unfavourable connection between "government program" and "retail services".  The OpCo(s) has been suggested by some to be a good fit for a non-profit.  I fundamentally disagree, simply because the purpose of the OpCo(s) is to entice as much competition as possible without the politics, turn-over, or personal passions getting in the way.

In an ideal world, the OpCo(s) would be robots, or fixed-fee 5-year tendered contracts for up to [the maximum connections to each single premises, minus one] wholesale management companies (which of course would be a mix of interest groups, incumbent subsidiaries, and market entrants), but a mix of interest options would make sense.

The Retail Service Provider layer is very complex, full of regulatory oversight and content-owner control.  Each retail provider should be able to choose its OpCo at will, in this model, in the hopes of driving up innovation and collaboration, and not collusion, between the OpCo(s).

What about the smaller communities?  Is there something that they could do that is even affordable?

YES, and kind-of.

In a market smaller than 100K, a new FTTH initiative as laid out above is not going to work very well.  Functionally, it can be deployed without issue, but that "technical feasibility" is only 20% of the whole picture.

When Lightcore Group looks at feasibility, we break it down into five distinct categories, and then build them back up inter-vetted and intertwined into a solid statement of feasibility.

In short summary, our municipal FTTH feasibility method is acronym'ed S.T.E.L.E. (a minimal illustrative scoring sketch follows the list):

  1. Social Feasibility
  2. Technical Feasibility
  3. Economic Feasibility
  4. Legal Feasibility; and
  5. Environmental Feasibility
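
As a purely illustrative sketch (not our actual S.T.E.L.E. methodology), the five categories could be inter-vetted by requiring each of them to clear a threshold before an overall statement of feasibility is issued; the 0-1 scores and the threshold are assumptions made for the example.

```python
# Illustrative sketch only: the scoring scale, threshold, and all-must-pass rule
# are assumptions, not Lightcore Group's actual S.T.E.L.E. methodology.
STELE = ["Social", "Technical", "Economic", "Legal", "Environmental"]

def feasibility_statement(scores: dict[str, float], threshold: float = 0.6) -> str:
    """Each category is scored 0-1; a single weak category sinks the whole initiative,
    reflecting the point that "technical feasibility" alone is only part of the picture."""
    missing = [c for c in STELE if c not in scores]
    if missing:
        return f"incomplete assessment: {', '.join(missing)}"
    weak = [c for c in STELE if scores[c] < threshold]
    if weak:
        return f"not yet feasible; weakest categories: {', '.join(weak)}"
    return "feasible across all five S.T.E.L.E. categories"

print(feasibility_statement(
    {"Social": 0.7, "Technical": 0.9, "Economic": 0.4, "Legal": 0.8, "Environmental": 0.7}
))  # not yet feasible; weakest categories: Economic
```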

The truth is that launching an ISP in any municipality of any size is generally going to be facing a large uphill battle against bundled services from other providers.

If you're considering an under-served or un-served market, then none of this applies to you; call Lightcore Group, we can have you up and running successfully in the shortest time.

That bundling is purposeful.  It makes the existing services sticky to their provider.  We realized in the O-NET launch, rather quickly, that you at least need to bundle some sort of IPTV content with your service to entice people to consider changing their life habits.  But that is just it: you're asking people to change their universe for you.  No amount of community engagement is going to get people to make the purchase to the tune of a 40%+ take-rate on day one, or year one, unless you have some other major change-agent, such as a fanatically-loved brand like Google Fiber.  Or, if an incumbent plays on your network (again, this is not an under-served or un-served market we're talking about), which they won't without market pressure to do so, you'd have a better chance of switching people to your FTTH infrastructure without them even really knowing, via a "standard or special" network maintenance initiative to pair the two.

A small community may be able to solicit a competitive carrier to become a market entrant, but the numbers have to be there for their value to balance against initial marketing and on-boarding costs.  But regardless of RSPs, if they aren't fundamentally different in the customer-perception of their vision and mission (meaning don't sell "copper, me too" bundles and packages over FTTH, which is as helpful as "press or say one"), then the RSPs will need to wait out existing contracts, brand development/trust, and significant losses somewhere in the overall FTTH initiative, which regardless will be borne by the citizen.  And don't be shy in your estimates of the ruthless competitive pressures from existing carriers not in favour of giving market share away.

My recommendation is NOT that smaller communities avoid FTTH, which some people may have mistaken me for saying tonight, but rather that their initiative has to be about a bigger picture, a longer time-frame, and ultimately a multi-municipality collaborative (meaning shared governance) effort to accumulate a large enough market to entice positive industry change, and relevant local rewards of that change.  Smaller muni's need to work together to create a larger collective market that is accessible "as a service" to the remaining layers needed to generate the opportunity demanded of their jurisdictions.

We would welcome any municipality considering a FTTH initiative to give us a call to get our perspective on the realities of the industry, without either the hyperbole or the FUD (fear, uncertainty, and doubt) of the camps on either side of our positioning.

Our model for smaller municipalities is based on risk mitigation through the open practices and reference-model operations that get a muni up and running quickly.  Our Municipal FTTH funding division, Lightcore Finance, will fund nearly any FTTH project that has been vetted by our team of industry experts to ensure valid practices and procedures are being proposed and adhered to.

If you have a question, post it, I'll answer it.  :)

White Paper: Telecom as the Middleware Stack | Orchestration of Virtualization & Oversubscription of Core Operations

I have finally published my first white paper on the revolution of the telecom industry that has the potential to change the way we communicate, share, and exist.

Authored on: 2014/02/07
Published on: 2014/04/07
Copyright (c)2014 Lance G. Douglas.  All rights released.

Objectives

This white paper investigates the opportunity and risk-mitigation made available when migrating to a real-time, telecom-operator-wide integration of the control, delivery, and operations planes via a proven purpose-built middleware stack approach.  The proposed result being the stability of immediate- and long-term viability, protection from disruptive technologies, as well as competitive advantage over market challengers while converting existing cost-centers into new profit centers.

 

About the Author

Lance G. Douglas has been an in-the-trenches innovator in the North American telecom, government, and enterprise managed-technology space for nearly 20 years, and most recently led the technology and operations’ vision as founding CEO of Canada’s first Gigabit city service provider, O-NET.  Having worked within and for leading service providers, Lance is innately aware, with both outside-in and center-out perspectives, of the time, cost, and compounding-value challenges that service providers face when responding to market challengers, disruptors, and opportunities.  Lance’s passion is injecting optimization and flexibility into corporate-strategy through the commoditization of orchestration and virtualization of telecom operations, resulting in reduction of risk and in the compounding value and agility with each move forward.

Working at WSO2 has afforded Lance direct access to leading middleware technology that has the opportunity to offer the telecom industry a safe and phased approach for a service provider to quickly move into a future-defining position with compounding value vs. a challenger-responding, future-chasing situation with only incremental one-off or proprietary value-adds relative to enhancements.

Technology Strategy Has Nothing To Do With Technology

While researching Telecoms that are breaking out of the norms of acceptable rates of innovation, I came across Cox Communications and their impressive CTO, Kevin Hart.  He is really shredding the envelope on the industry's status quo[1].

I've had personal sessions with other CTOs and executives of the large Telecoms in North America and most are trying to be innovative for internal business cases, but I find that Hart is really shifting the corporate technology strategy for the industry.

Reading about Hart, I was reminded of a principle that I've worked under for quite some time, and it has both unleashed me and tripped me up.  Hart clearly hints at it in two statements attributed to him:

  • "Half of technology is probably anything but technology." [1]
  • "It is important to understand the business drivers. All too often, IT leaders do not focus enough attention on that." [2]

In my past, I was, as a 13-year reunion with a colleague reminded me, an outright code monkey.  At first, I felt a sliver of resentment, because in my mind, even back then, I had realized that while coding amazingness right out of thin air was awesome, the world doesn't revolve around technology and never will.  It revolves around living and the business of living.  But I truly respect the colleague and knew that he spoke from his heart and experience working with me; I realize that back then, my only way to express that budding understanding was through my profession as a hands-in-deep Dir. of "Technology" for a government-targeting cloud start-up based on technologies that I designed and mostly wrote.  Luckily, I haven't coded anything professionally in 10 years, other than side jobs to keep me abreast of the latest.

Recently, while considering opportunities for using my talents more effectively on a global scale, I met with some VPs of a global technology group.  The conversations were, while all interviews, mostly focused on my thoughts on specific technologies and my experience using technologies for the sake of the technology.  What I found most interesting is that they, seemingly, were not at all interested in what was made possible - what markets were empowered, what lives were changed and how, what costly assumption was eradicated, what internal operational efficiencies were achieved in tandem - with the technologies under my belt in relation to their current and possible businesses, and they didn't seem to associate technology-agility as solely an aspect of business innovation.  Ugh.  In the end, in post conversations, the concept of "lite on the technology side" swung the door closed for me; and I'm happy to be a technology consultant for them, hired for a business case to help innovate their technologies, versus a technology-leadership employee hired to shape their technology vision (an uphill climb that had too many entrenched obstacles to be fun).  And here is why I think Technology Strategy Has Nothing To Do With Technology:

Technology is nothing more than a tool.

Just like a hammer, or a scanning tunnelling microscope, or an adaptive-overlay communications network, without something to build, see, or communicate, respectively, the tools don't have a purpose in life or business.  Now, of course, there are organizations in the business of building better tools, and in some cases, even creating new industries or life use-cases right out of thin air, in the market gap that their new tool bridges.  But it all still ends up as a business or life case driving the value and implementation of the technology as a vehicle to a tangible and/or intangible result not possible with the tool alone, but entirely possible, almost always in some way, without the tool at all.  And, in the situations where the technology birthed the use cases, the technology does not continue to control the existence of the resulting use cases, simply because the technology was a trigger to the possible, a stepping stone, not the actual existence of the possible.

Technology Strategy is 100% about Empowering Results in Business and Life.

For many years, I've learned to focus on usability, or experience, needs and how technology can fill those needs, or bridge the gap to other means of filling them.  When a technology company focuses too much on the needs "of the technology" in a business, they will struggle with both internal and sales operations that won't be focused on the only thing that matters: who is going to use my technology to make their world amazing, and how can I get my technology in front of them in the right context to shine the possibilities that likely only they can understand up-front.

The ones who buy/use technology either have an idea of where they are trying to get on a road-map with it, or they are inspired by the technology to create a new idea with it.

 

Comment below, I'm always interested in hearing others' opinions.


Telecom as the Middleware Stack: Orchestration of Virtualization & Oversubscription of Core Operations

This is my first post in a series that will detail my progress in implementing an end-to-end, open-source, configuration-defined Telecom Operator.  The Operator will be designed to deliver all of its features and functions, both internally and externally, in a unified and completely abstracted service model.  I named the result "Telecom as the Middleware Stack" as a way of expressing that it becomes the service it provides and is reduced to a consumer of everything and anything that it can communicate, as an adaptive version of one overall amazing experience.

Imagine a tablet that, depending on who's holding it, adapts its purpose.  In one person's hand it's an enterprise's CRM experience; in someone else's hand it's an IPTV set-top-box; in another's, a portal to live cameras; and in another's, the live network map of fiber and copper in the ground within 10 feet of where they're standing.  Imagine that tablet is simply an interface that can be anything, to anyone, based purely on authentication, authorization, policy, and governance; a completely adaptive experience delivered all by a single device.

We can imagine that quite easily because the tablet has evolved to be just that: an adaptive blank slate.  Now imagine that same experience with every digital service, every electronic device, in every conceivable layer of service delivery, and for every desired experience that a Telecom Operator could provide, all from a single Service Provider: that is the promise of the integration of NFV, SDN, IoT, and the Telecom as the Middleware Stack approach.  That is what I want to outline and explore in this series.
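
To make the idea concrete, here is a minimal sketch of that adaptive experience as nothing more than policy-driven selection over a single service model; the roles, the policy table, and the experience names are illustrative assumptions, not the actual stack.

```python
# Illustrative sketch: one device, one service model, many experiences,
# selected purely by authentication, authorization, and policy.
EXPERIENCE_POLICY = {
    "enterprise_sales": "CRM experience",
    "residential_subscriber": "IPTV set-top-box experience",
    "security_operator": "live camera portal",
    "field_technician": "live fibre/copper network map within 10 feet",
}

def render_experience(authenticated: bool, role: str) -> str:
    """Return the experience this user is entitled to; the device itself is a blank slate."""
    if not authenticated:
        return "login prompt only"
    # Authorization and policy decide everything; the hardware decides nothing.
    return EXPERIENCE_POLICY.get(role, "basic self-care portal")

print(render_experience(True, "field_technician"))
```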

The goal of the journey of this series is to:

  1. Reduce Telecom Operator costs associated with closed, legacy, and/or proprietary IT.
  2. Increase productivity through experience architecting the multi-tenant, multi-purpose usability.
  3. Entrench technology-agility, real-time intelligence, and market-viability.
  4. Mitigate risk of vendor, technology, or strategy lock-in.
  5. "Open" every door for innovation from internal and external perspectives.

The strategy to obtain that goal is to:

  1. Develop and expose the entire TMForum process-standards, application-frameworks, and API reference-models into a centralized Governance Model, Multi-Tenant/Purpose Policy Engine, and API Manager.
  2. Abstract and integrate the wiring-harnesses of network function virtualization (NFV), software defined networking (SDN), and internet-of-things (IoT).
  3. Utilize WSO2's unified product stack to implement as much of the integration, governance, policy enforcement, complex event processing, and user experience as possible, but no more than that.
  4. Decouple systems from user experience and marry them back adaptively via policy.
  5. Introduce "open-source" principles to the experience architecting aspect.

The intended result is:

  1. One unified strategy, a single elastic implementation, redefined as an infinite number of adaptable marketable experiences: "Orchestration of Virtualization & Oversubscription of Core Operations"

Figure 1 - Telecom as the Middleware Stack | Examples of one implementation oversubscribed as many different experiences to many different communities of users.

I've written a white-paper with the exact same title, which is due to be published in the next month or so (once reviewed by my WSO2 colleagues), but below is an excerpt:

Executive Summary

A telecom that migrates to operating as a middleware stack empowers its core operations to truly introduce agility, stability, and profit-center generation, bringing together the industry's advancements in elasticity and virtualization while providing the necessary business and marketing integration that empowers the orchestration required for success.

The world of connected-business is being monetized by telecoms via their cloud-based (multi-tenant virtualized) service offerings. Interestingly, the most effective means of agility and leadership in this space is to first be a 100% connected-business internally and then to resell directly that internal-integration as externally-exposed offerings, which drives down costs and increases market-responsiveness.

If oversubscription is the backbone of telecom revenue, then reducing or removing the costs of sales and innovation from customer acquisition, market pivots, and differentiation-growth is the most stable way to protect and expand on the existing thin margins.

Feel free to comment along the way and feel free to pitch your ideas and efforts in.


General Internet Safety Recommendations

A few friends have been writing me and asking for advice on how to keep their computers safe on the internet.  Some have written, and didn't even know they did (wink, wink), usually under the guise of a link that "I just gotta follow...". While there is no real safety on the internet, below is the list that I at least try to stick to in my house.

  • Rule 1: The only truly safe computer is the one that won't ever exist.
  • Rule 2: Everyone on the Internet wants to exploit you in some way, everyone.
  • Rule 3: Every computer must have anti-virus software installed.  Free is available, but a paid version is recommended simply because by being a revenue stream for the AVS company, they'll make the effort to make sure you don't forget to continue being a paid subscriber (so exploit their business model).  I prefer the ease of Avira.
  • Rule 4: Never follow unsolicited links in emails, even if you trust the source.  Consider links the big red button that, when pushed, will destroy the world (although that would make computers safe).  If you solicited the email AND link, then copy&paste the link into your web browser instead, if you can.  Just so you are aware, a link can "say" anything its author wants, so it "may" look like a valid address, but it can take you anywhere the author wants when clicked.  For example, the following link actually takes you to an RCMP website but looks like it takes you to the Royal Bank http://www.rbcroyalbank.com.  By copying the link, you'll copy the "text", not the underlying "link destination".  But even this isn't always safe, because anyone can have a domain name and make it look safe, such as: http://rbcroyalbank.trust.me, which when copied and pasted would actually take you to a trust.me controlled website, which could be something malicious.  (See the sketch after this list.)
  • Rule 5: Just because a website can "look" legitimate, it can be fake, and really it just wants you to enter in your login information so they can steal your life from the real website.
  • Rule 6: No email that you get telling you that "I can't believe what they're saying about you..." should be followed up by following the link provided.  Follow up with a phone call.  And don't reply to the email.
  • Rule 7: Have a separate email address for website accounts and use another one for friends and families.
  • Rule 8: Don't be lazy about your security, the people that want to exploit you (see rule 2 to find out just who that is) are always just a little less lazy than you.
  • Rule 9: Never ever give anyone your passwords to anything.  No legitimate website or friend will ask you for your password, or to remotely control your computer unsolicited.  Consider them like your Will, it will be exploited if you share it.
  • Rule 10: Keep separate passwords for work and play, but one no less safe than the other.
  • Rule 11: Stay off of immoral websites; the risk of being exploited is 4000% more likely (yes, I made that number up, and I think it is likely more conservative than exaggerated).
  • Rule 12: Google images is not necessarily a safe place to search.  The pretty pictures of flower gardens may actually link to a malicious website that, when visited, attempts to infect unprotected computers (I have seen this happen to the safest people with a non-updated AVS running on their computer and using an old browser).
  • Rule 13: Update your web browser.  There is no "great" browser, Chrome is fast but owned by Google, Firefox is good for techies but getting quite bloated, IE is just as good as any other but the obvious target of hackers simply by numbers.

The Business of Looking Good: Why vs. What

Interesting cross-thought here.  After contemplating Simon Sinek's brilliant "Start with why" speech, here, the thoughts eventually collided with previous thoughts-entertained revolving around the way the pretty people have an advantage in a visual environment, the well-spoken in an audible one, etc...  I wonder how deep that "why" goes.  Are the barriers to trust/love/acceptance fuelled by knee-jerk reactions to sensed reasons "why" someone may not be trustworthy/lovable/acceptable?  If the beautiful are innately more available to success, do they innately seem to have less reason to be disbelieved?  Just post-it-noting that for further thought.

Cornerstone Thinking

I've had, in the past few years, several respectable, and respected, people share their ideas with me.  And in thinking on how to articulate this next line, I just realized that quite often those ideas, regardless of how the ideas are framed when shared, are in need of a personality that will help anchor the idea to "what" will get that idea from, what I like to say is, "TO-BE" to "AS-IS".  I like to think that, as those that share their ideas with me are seeking out that "what" and its source, they feel that they may see a glimpse of something in me that reflects what they think they need.  Further to that point, I am, but shouldn't be, amazed at how folks, including myself at times, are capable of being so convinced of, or possibly better stated as being thoroughly embedded within, the vision of an idea, that they consider the idea to already be real and in an "AS-IS" tangible status.  To counteract this detrimental leap from reality -- in those potential cases I remind myself that -- we must exercise our ability to objectively push our idea through a gating process that reveals the truth and route to viability.

We've all done it; seen it; worn out the t-shirt.  A great idea leads to momentum, which, if not moved from the idea machine into the reality funnel aimed at the creation machine, becomes bloated, convoluted with tangents, and takes years of effort before it either dies under piles of wasted money, or we/they realize that the idea must actually go through the correct gating process.  In other words, trying to move from vision to creation without proper anchoring to reality leaves the same result every single time: drawn-out failure (whether outright or simply due to marginalized success, failure nonetheless).

In the past couple decades, I've been involved with many successful and failed ideas, from many different positions on the totem pole.  I've learned a great deal in every one of those failures and reinforced those lessons in every success.  It all culminates into a single concept: what I'm terming Cornerstone Thinking in this chapter.

The concept, as many other frameworks proportionally purport, is to have a standardized tunnel/container/box, which ideas enter into and are maintained within, during the reality and creation gating processes.  The sole purpose of this controlled environment is protecting the mindset around the processing of the idea into something tangible.  The reason this controlled environment is required is the common confusion that the necessarily near-chaotic environment of idea generation may in any way permeate, support, or provide value to the creation process.  This confusion has been further demonstrated by the common reality experienced by iterative development efforts; the unwitting, or disillusioned, participants fail to realize how much more structured and controlled that environment must be to result in a successful outcome, at a rate comparable to traditional waterfall development.

This chapter will lay out the framework for anchoring an idea into a protective environment, which will allow that idea to properly maintain its position of "cornerstone" in the building out of the necessary constants and variables required to carry the idea through to fruition.  The terms "reality" and "creation" gating processes are abstracted, and necessarily meant only to be demonstrable of the evolutionary process in phasing an idea through a validation phase into a best-prepared realization phase.

Looking for Evidence

The threat and weakness in seeking out evidence is that quite often, when found, evidence is susceptible to bias and/or a lack of understanding of how that knowledge is influenced by, or influences, the bigger picture.  Evidence must be evident from upstream, downstream, and horizontal perspectives.

Wisdom demonstrates that evidence is knowledge, which leads to understanding, which is unbiased and context-specific first.  This fact is invariably realized within all intelligence gathering, origination, and organization.

So instead of becoming lost in the biased speculation of "Looking for Evidence" and believing that knowledge is understanding, you must work within a framework for the origination and organization of humble knowledge.  Just like when building a puzzle, find the picture first, not just hints of speculation.  When the picture doesn't exist, realize that every piece of knowledge is speculation until the understanding of the picture can be realized.

What does humility have to do with it?  Genius is understanding that you cannot know everything; and wisdom is experience of threshold(s) needed to achieve valid perspective from what you do know.  (Rings as: fools speak early but say less, the wise speak less and say more).

Intelligence Design: The Language of Success

Over the years of technology analysis, design, and development efforts that I've been a part of, I've noticed a common thread: "I think, therefore I can".  Sounds strange and simple, or just strangely simple, depending on your bent.  But we can all relate, which I'll hopefully demonstrate here along with my solution that doesn't even require a scalpel.  We have all heard, or said, one time or another, in our heads or out loud, verbatim or likewise: why isn't commonsense more common?  What is scary, a realization that I believe that I've come to, is that there is a direct inverse scale of the critical nature of the need for commonsense against the presence of commonsense.  In other words, if I smitty'ed that accurately, the more critical a situation's need for commonsense, the less likely commonsense is to be prevalent amongst the actors within that scenario's anticipated critical source of that same said commonsense.

That translates into a thought-entertained that possibly commonsense does poorly under pressure, no?  Maybe.  But what I do notice is that commonsense is, at least, assumed to a greater extent of those put under pressure in such situations.

I've found that the analyzing and recognizing of my own SWOT to decision-making, -baking, and -taking has allowed me to effectively run my "all of the above" through a mental-mapping-process that reveals my best-chance for a positive evaluation of "what makes sense".  That is what makes me capable of very complex thought-maps of projects, visions, opinions, and/or concepts from various perspectives against one-to-many linear points of variance.

So, the fact that I say this process exists is fun.  But over the years, I've been working in many situations where, what I had assumed was, logic did not prevail.  In retrospective analysis, against the full gamut of blame, I learned, long ago, that the explanation of an idea is actually more important than the value of the idea, whenever the communication of the idea is required for the breath-of-life to be granted to the idea.  Thus, since the stakeholders, influencers, and/or support-structures of that breath-of-life are both numerous and of varied perspectives in each situation, as all can imagine, the explanation of an idea is not a single-faceted proposition.  One, to be successful in all the various iterations and variations and incantations of participants' perceptions, must develop a structured mental capacity for the origination, organization, and presentation of complex ideas.

In my head, that structure seems to have existed prior to my understanding of its need, as well as my comprehension of its depths.  It has taken many trees and I/O to have my brain and my understanding arrive at the same place, at the same time.  Synonymous to the value of breath existing way before the understanding of its need or its power to change the world in a word.

However, just like the power of the word, the need and power of intelligence design must be matured greatly, and against many perspectives, prior to wielding.  Moreover, possibly most importantly, the cornerstone of its value is humility; the language of success has dialects and evolutions, and we can never "know it all", we can only simply know there is always more to know than our own "all".

I am not only here to just succeed, I am here to be success.  What follows in this section is the synopsis of my upcoming book: Intelligence Design: The Language of Success.