
Friday, February 6, 2015

Virtualization: There’s got to be more!


Porting to Intel and virtual machines is a technical implementation detail, not a business solution.

When vendors are asked about their virtualization strategy you often hear a common answer.  They say they’re “virtual” since they ported their software to Intel.  What’s the value proposition?  Porting to Intel isn’t it.  Sure, it reduces CAPEX, at the expense of performance.  All it really does is shift industry revenue, power and influence from the Broadcoms of the world to Intel.  Plus, we now know that even if CAPEX goes to ZERO, less than 33% of CxOs’ top-of-mind business problems are solved.  When pressed for the rest of their virtualization strategy, they say they run on virtual machines in a data center.  OK, and then what?

Let’s assume they perform three functions called A, B and C.  They port them to Intel and then run them on virtual machines (VMs).  Shifting revenue from Broadcom to Intel is a technical implementation detail, not a business solution.  Service providers should be thinking: there’s got to be more.



The logical question to ask is whether A, B and C are the right functions in the virtual world.  Just because they were required in yesterday’s environment does not mean they are required in the virtual world.  Maybe you only need 20% of A and 60% of B.  Maybe you need 150% of C.  You get the picture.
Service providers will be spending billions of dollars moving to the virtual world, so they should be asking themselves: why?  Sure, there’s a benefit to taking the A’s, B’s and C’s of today’s world to virtual machines.  But is it really enough?  This is a once-in-a-lifetime transformation and a fight for ultimate survival.  SPs need to ask for more.


Vendors, on the other hand, need to be asking themselves similar questions.  Is porting to Intel enough?  What can we do that’s game-changing in the virtual world?  The answer, IMHO, to the first question is: no way.  The answer to the second question depends on the vendor’s core competencies, ecosystem presence, business strategy, et al.  The good news is that ACG Research can help you answer this second question.  Give us a shout.



Friday, October 24, 2014

Cloud (Network) PVR Discussion



After a delay due to a legal review, which was settled favorably for the SPs, network DVR/PVRs (N-PVRs) are being deployed in earnest by all SPs. N-PVRs are becoming mainstream: in addition to improving the consumer experience, they also offer numerous business benefits to the SP. N-PVRs are also a great step toward virtualization and the move to everything on demand. Simply put, an N-PVR is a DVR that resides in the cloud. When a viewer stores a program, instead of it being stored on a hard disk drive located inside the set-top box, the program (or metadata) is stored on a server or CDN cache located in the service provider’s network.
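To make the distinction concrete, here is a minimal sketch (Python, with hypothetical field names, URL and retention window) of what “saving” a program means in the N-PVR model: the box never writes video to a local disk; the recording is just metadata pointing at a copy held in the SP’s network.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class CloudRecording:
    """Hypothetical N-PVR catalog entry: no video bytes are stored in the home."""
    subscriber_id: str     # who pressed "record"
    program_id: str        # what they saved
    cdn_asset_url: str     # where the SP's server/CDN cache holds the copy
    recorded_at: datetime
    expires_at: datetime   # retention window is a business/legal policy knob

def save_program(subscriber_id: str, program_id: str) -> CloudRecording:
    # Illustrative only: "recording" becomes a metadata write pointing into
    # the SP's network, not a write to a disk inside the set-top box.
    now = datetime.utcnow()
    return CloudRecording(
        subscriber_id=subscriber_id,
        program_id=program_id,
        cdn_asset_url=f"https://cdn.example-sp.net/assets/{program_id}",
        recorded_at=now,
        expires_at=now + timedelta(days=28),   # assumed retention policy
    )
```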
N-PVRs benefit both the service provider and the consumer. Service providers benefit substantially on both CAPEX and OPEX. Service providers have a love-hate relationship with the set-top box. They love it because it provides a managed service enablement platform in each home, yet they hate it because set-top boxes account for about 50% of total CAPEX. It is interesting to note that providers argue the boxes are part of the network when it is convenient and argue that they are CPE when it is not. The latter is possible because of the regulatory ambiguity of Title VI in the U.S.

By eliminating the hard drive from every set-top, the cost of the box is reduced, directly impacting CAPEX. Hard drives, being mechanical devices, will fail. Eliminating the drive thus increases the reliability of the box and reduces angry customer support calls and truck rolls, which directly impacts OPEX. Without a hard drive, the set-top box also consumes less energy, supporting the SPs’ goal of meeting the voluntary energy-reduction agreement the industry and the U.S. Department of Energy (note: not the FCC) signed in early 2014.
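As a rough back-of-the-envelope illustration of the energy point (every number below is an assumption for illustration, not measured data), a few watts of hard-drive draw per box adds up quickly across a large footprint:

```python
# Back-of-the-envelope sketch; every input below is an assumption.
boxes = 20_000_000          # deployed set-top boxes across a large SP footprint
hdd_watts = 5.0             # assumed incremental draw of a spinning drive
hours_per_year = 24 * 365

kwh_saved = boxes * hdd_watts * hours_per_year / 1000
print(f"~{kwh_saved / 1e6:.0f} GWh saved per year")   # prints ~876 GWh
```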

N-PVRs store consumers’ saved programming in the cloud. This simplifies whole-home DVR and video-everywhere service offerings. With N-PVR, all consumer playback originates in the network and not the primary set-top box. Although newer homes have coax cable widely installed throughout the home, the bulk of homes do not. All in-home technology deployments are challenging for the SP because of the high variability in both the housing stock and in consumer sophistication. With the N-PVR, all video streams are delivered from the network as “just another” channel.

N-PVRs benefit the move to TV everywhere, or TV to all devices. In the home, consumers watch TV programming on all of their devices. Because Wi-Fi is the common fabric connecting every device, N-PVR-based video programming can be sent via the broadband connection and over the Wi-Fi network to all devices. Thus, consumers can view stored programming on all their devices, and the SP does not have to contend with home networking issues.
The N-PVR takes this concept out of the home as well. Consumers can view stored programs from anywhere, on any device with a broadband connection. This also applies to the delivery of programming on smartphones via 4G/LTE connections. In each of these cases, the stored SP programming is treated as over-the-top (OTT) and faces the same challenges pure OTT suppliers face, such as quality of service and data usage caps. Equally, it can benefit from the innovation in the OTT marketplace.

This everywhere-DVR experience also presents a new revenue opportunity to the SP. Targeted advertising and local ad insertion take on new meaning. For example, if a Boston-based consumer is watching a stored program from San Francisco, why show an ad for a Boston-based car dealership? Similarly, with smartphone location-based services being widely deployed, the SP can insert a targeted ad based on the user’s real-time location.
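A minimal sketch of the idea, with a purely hypothetical ad inventory and market lookup (not any particular ad-decision system): the ad is picked from where the viewer is right now, not from the billing address on the account.

```python
# Hypothetical sketch: choose an ad by the viewer's real-time location.
AD_INVENTORY = {
    "boston": "Boston-area car dealership",
    "san_francisco": "San Francisco-area car dealership",
}
DEFAULT_AD = "national brand campaign"

def choose_ad(current_market: str) -> str:
    # current_market would come from the device's real-time location
    # (e.g., smartphone location services) -- an assumption here.
    return AD_INVENTORY.get(current_market, DEFAULT_AD)

# A Boston subscriber streaming a saved show while in San Francisco:
print(choose_ad("san_francisco"))  # -> San Francisco-area car dealership
```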


As illustrated, N-PVRs not only offer consumers a better video experience, they offer the SP real business value: reduction of both CAPEX and OPEX and creation of new revenue-generating opportunities. Properly deployed, the N-PVR infrastructure will create the foundation for everything on demand and the move to virtual set-top boxes. Given these factors, SPs should deploy N-PVRs aggressively, with the caveat that they must take a long-term strategic approach, viewing N-PVRs as the first application of the platform and not the only application.

To discuss this please contact me at gwhelan@acgresearch.net

Tuesday, June 10, 2014

IoT Success: Batteries & Backhaul...& Transparency

The Internet of Things (IoT) is riding high at the peak of the Hype Cycle.  Is it a $17 Trillion market or merely a $10 Trillion market?  It depends on what you include in your definition of a "thing".  The more you include, the bigger the market.

IoT applications have a basic common architecture, as shown in Figure 1: IoT digitizes some analog parameter and sends it to the "cloud" for analysis and possible action.  The primary factors that all IoT or M2M applications must address are Batteries, Backhaul and Transparency.
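A minimal sketch of that common architecture, with a hypothetical sensor read and a hypothetical cloud ingest endpoint standing in for the real "thing" and backhaul:

```python
import json
import random
import time
import urllib.request

INGEST_URL = "https://iot.example.com/ingest"   # hypothetical cloud endpoint

def read_temperature_c() -> float:
    # Placeholder for the "thing": digitizing an analog parameter.
    return 20.0 + random.uniform(-0.5, 0.5)

def send_reading(sensor_id: str, value: float) -> None:
    # Backhaul step: ship the digitized sample to the cloud over IP.
    payload = json.dumps({"sensor": sensor_id, "temp_c": value,
                          "ts": time.time()}).encode()
    req = urllib.request.Request(INGEST_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req, timeout=5)

if __name__ == "__main__":
    send_reading("sensor-001", read_temperature_c())
```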
 

Let's address the transparency issue first.  In quantum physics there's the "uncertainty principle".  Simply put, it says that whenever you measure a system you disturb it.  Since most IoT applications measure a real-world analog phenomenon (e.g., temperature, pressure, et al.), the "thing" must do so with minimal impact on the system being measured.  Transparency parameters include cost (CAPEX and OPEX), size, weight, aesthetics, etc.

Batteries, or more generally power, are a critical parameter within IoT applications.  If you require grid power, you lose some transparency and limit your ability to deploy the thing.  Not every location will be close enough to the grid to be powered by it.  Remote sensors will require batteries, and these batteries must last many months and even many years.

Even IoT applications within the home must address the battery issue.  Take a simple motion detector.  The ideal placement is in the corner of the room near the ceiling, where there are not many power outlets nearby.  Thus the customer can either install an outlet close by, move the sensor close to an existing outlet (i.e., near the floor) or live with a wire dropping down to the nearest outlet.  Batteries solve this problem.  However, if they need to be replaced every month, the value and transparency quickly depreciate.  What if this motion detector is part of a security perimeter for a high-value asset (e.g., a power plant)?  If the good guys need to replace the batteries periodically, it will show the bad guys where these "hidden" sensors are.  Thus, batteries are critical to the success of the application and can make or break a business case.
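A rough sketch of why the battery question can make or break the business case: duty cycling (sleep almost all the time, wake briefly to sense and transmit) is what turns months into years.  All current-draw figures below are illustrative assumptions, not datasheet values, and battery self-discharge is ignored.

```python
# All electrical figures are illustrative assumptions.
battery_mah = 2400            # e.g., roughly a pair of AA cells
sleep_ma = 0.01               # deep-sleep current
active_ma = 30.0              # sensing plus radio burst
active_s_per_hour = 2.0       # awake 2 seconds out of every hour

duty = active_s_per_hour / 3600
avg_ma = active_ma * duty + sleep_ma * (1 - duty)
hours = battery_mah / avg_ma
print(f"average draw ~{avg_ma:.3f} mA -> ~{hours / 24 / 365:.1f} years")
```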

The "I" in "IoT" is for "Internet", meaning internet protocols (IP).  The digital data of the analog phenomena must be sent to the cloud via some type of network. This is referred to as Backhaul.  Networking options are plentiful  and include 2G/3G/4G/LTE, Wi-Fi, Zig-Bee, Satellite, Blue Tooth, Ethernet and local broadband options.  The technology selected depends on the application and on parameters such as data rates, latency, cost and what's available.  The more remote the thing is the less options are likely available.  The selection of a backhaul solution must address both transparency and battery issues discussed above.  

There are other issues and parameters that need to be addressed to make an IoT application successful.  For example, the cloud solution (e.g., the "big data" store, analytics, heuristics, et al.) is not trivial, yet it is a solvable engineering problem.  The same is true for the "thing" or sensor: for most applications all you have to do is go to the Analog Devices catalog and select a chip.  Again, nontrivial but solvable.  Thus, batteries, backhaul and transparency are the critical make-or-break parameters to ensure the success of your IoT application.


To discuss this please email me at gwhelan@greywale.com

Other articles can be found at greywale.com




Wednesday, June 4, 2014

The Next Cord Cutting: Real Cord Cutting


Today "cord cutting" refers to consumers who stop paying for TV and go broadband only from the cable or telecom company.  This is more accurately called "cord shaving".  The economic impact is significant but it's more of a redistribution.  More money to "Netflix" and less to the service provider for video.  Yet, more to the latter for higher capacity broadband that provides higher margins.

Tomorrow's "Real Cord Cutting" refers to consumers who completely stop all services from a wired service provider.  The go completely wireless.  We've seen the prequel with the elimination of a "home phone".  This next generation cord cutting has consumers relying on their 4G/LTE service for all broadband services.  This can be accomplished by simply turning their smart phone into a Wi-Fi access point when in their home.  The economic impacts of next generation real cord cutting are severe. The fixed access service provides not only lose all service revenues they lose customers entirely.

As 4G/LTE deployments expand and as more small cells get deployed, the average bandwidth per device will increase substantially.  Crossing the Netflix threshold (i.e., the point at which the quality of streaming video is acceptable) is only a matter of time.  Slowing down real cord cutting will be the price of mobile data plans, which would eliminate the intended savings in the first place.  Service providers with wireless assets will be in a strong position to succeed in this future scenario.  Others, such as cable MSOs, will need to address the pricing plans that are driving customers away in the first place.  They can also compete with unique content, primarily "Sports and Wars" (i.e., live programming), and push for better-quality video such as emerging 4K technologies.

Today's cord cutting is growing significantly, especially in the under-30 demographic.  Tomorrow's real cord cutting will occur and will cause substantial economic disruption for the entire ecosystem.


For past articles please visit greywale.com


Wednesday, May 21, 2014

Open Internet + Fast Lane: Win for Consumers: Yet Trust but Verify


The recent move by the FCC is a win for Consumers.  Yet, it's important that the FCC "Trust but Verify".

Let's look at who will be the winners under the new FCC rules: the Consumers.  Consumers will be the ultimate winners.  First, the ISPs will get a fair return on their capital investments and will have the incentive to invest in more bandwidth, which enables wave after wave of innovation.  Second, the companies that pay for the fast lane will be those whose services consumers want and are willing to pay for.  Netflix, for example.

Keep in mind that the FCC proposal STILL prohibits the ISP from degrading traffic.  Thus, the consumer applications in high demand get preferred treatment for the last 50 miles of the network (CDN cache to home), and all other traffic gets treated the same way it always has been.

The argument that small start-up companies will be disadvantaged is hollow.  It will force entrepreneurs to innovate more to deliver a compelling product to consumers.  It's just another market force to overcome.  It will raise the bar and keep marginal applications from clogging the network.  This is no different than supermarket shelf space, a large barrier to entry.  Coke and Pepsi dominate.  Yet look at all the upstart beverage companies that keep gaining shelf space.  They're doing this by creating innovative products that consumers want, not by whining to some federal regulator.

Why do industry pundits complain when Verizon, AT&T, Comcast, et al. get a fair return on their investment and look the other way when Google, Amazon, et al. make $1,000s per second?

Therefore, Consumers are the big winner here: A) more bandwidth, B) more innovation and C) fewer marginal applications.

Given that the FCC is chartered, via Congress, to protect consumers, this new Fast Lane approach is a step in the right direction.  However, the service providers must be careful not to overuse this opportunity.  Hence the "Trust but Verify" mantra.

Telcos and cablecos must know that the FCC will be closely monitoring this new ruling (assuming it gets implemented).  They must adopt a high level of transparency to eliminate complaints from consumers.  Remember, all it takes is some savvy lawyer getting a single citizen to file a complaint.  To prevent endless litigation and legal costs that would effectively eliminate the economic value of the "fast lane", service providers need to provide this high level of transparency voluntarily, to avoid an FCC-mandated higher level of transparency.

SPs should freely adopt a level of transparency that satisfies consumers and their advocates while limiting the level of proprietary disclosure to their competitors.  They should also ensure the "fast lane" does not, by design, effectively harm all other traffic.  This can occur unwittingly using standard IETF IP networking protocols.
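As a toy illustration of how that unwitting harm can happen with nothing more exotic than strict-priority queuing (made-up arrival rates on a single link; not a model of any real network): once the fast lane alone can fill the link, best-effort traffic gets starved even though nobody set out to degrade it.

```python
# Toy single-link simulation: strict priority starves best-effort traffic
# whenever the "fast lane" alone can fill the link. All numbers are made up.
LINK_PKTS_PER_TICK = 10
FAST_ARRIVALS = 10          # fast-lane packets arriving per tick (fills link)
BEST_EFFORT_ARRIVALS = 3

fast_q, be_q, be_sent = 0, 0, 0
for _ in range(1000):
    fast_q += FAST_ARRIVALS
    be_q += BEST_EFFORT_ARRIVALS
    budget = LINK_PKTS_PER_TICK
    served_fast = min(fast_q, budget)       # strict priority: fast lane first
    fast_q -= served_fast
    budget -= served_fast
    served_be = min(be_q, budget)           # best effort gets the leftovers
    be_q -= served_be
    be_sent += served_be

print(f"best-effort packets delivered: {be_sent}, still queued: {be_q}")
# With the fast lane saturating the link, best effort delivers ~0 packets.
```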

Therefore, I believe the "Open Internet + Fast Lane" approach is worth implementing.  It ultimately benefits the consumer and it's fair to the service providers.  Yet, and a "BIG YET", the FCC must Trust but Verify.

To comment on this or to discuss this in more detail please contact me at gwhelan@greywale.com

For additional articles and analysis please visit www.greywale.com

Tuesday, April 15, 2014

Is VoLTE Worth the Investment?


Mobile network operators across the globe are moving to deploy VoLTE (Voice over LTE) systems.  The reasons for this are as expected.  They include:
  1. Better voice quality
  2. Protect voice revenues
  3. Leverage IMS investment
  4. Be able to provide billing services (Leverage their billing system)
  5. Migrate 2G and 3G voice services to LTE


However, given the successes of over-the-top (OTT) services over wired broadband and in current wireless networks, is this billion-dollar investment prudent?  Consider the following:
  1. OTT has won, or is winning, the battles.  Skype and Netflix are two good examples.  WhatsApp is another case in point of OTT success. 
  2. OTT services are “good enough”.  The majority of the market is unwilling to pay for QoS when a free, or near free, service is sufficient.
  3. When a consumer experiences poor quality for an OTT service they blame the service provider and not the OTT provider.
  4. QoS can only be guaranteed when the “call” is completely on-net.  As soon as the voice call leaves the originating SP, all bets are off.  Why spend the $ billions only for a subset of calls?
  5. Voice revenues and now SMS text revenues are crashing.   Why spend $ billions to chase a losing battle?
  6. Service providers are moving away from call-based billing (i.e., CDRs (Call Detail Records)), although the NSA loves them.  Unlimited calls and texts are the norm. 
  7. Data limits are prevalent.  A few YouTube videos or one Netflix movie will dwarf a month’s worth of phone calls from a data-usage perspective (see the rough arithmetic after this list). 
  8. When Google deploys its fiber-optic networks (e.g., Kansas City), it does not offer voice services.  The reason is to avoid the mountain of regulations required when offering voice services.  Does that mean people in Kansas City don’t make voice calls?
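On point 7, the arithmetic is stark even with generous assumptions for voice; the codec rate and usage figures below are assumptions, not measurements.

```python
# Rough arithmetic for point 7; all inputs are assumptions.
movie_gb = 3.0                      # one HD streaming movie, roughly
voice_kbps = 25.0                   # VoLTE AMR-WB plus IP overhead, roughly
voice_minutes_per_month = 500

voice_gb = voice_kbps * 1000 / 8 * voice_minutes_per_month * 60 / 1e9
print(f"month of calls ~{voice_gb:.2f} GB vs one movie ~{movie_gb} GB "
      f"(~{movie_gb / voice_gb:.0f}x)")   # prints ~0.09 GB vs 3.0 GB (~32x)
```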

It is understandable why a service provider with decades of legacy voice experience would want to consider VoLTE.  After all, they have decades of legacy voice experience.  Similarly, it’s not surprising that service providers that have spent $ billions and years deploying and perfecting IMS want to leverage that investment in time, money and careers.  It’s difficult to face the reality that IMS is a “sunk cost” and therefore should not be factored in when evaluating VoLTE investments.

The OTT trends and successes cannot be refuted.  Service providers must face the fact that they cannot compete effectively against every segment of OTT services.  The $ billions they would spend on VoLTE would be better spent on: 
  1. Increasing bandwidth per subscriber
  2. Providing industry leading network security to protect their subscribers
  3. Fighting the short-sighted Net-Neutrality laws that make regulators “feel good” at the expense of long-term viable markets.
  4. Creating an infrastructure where OTT’s want to pay them for QoS in a fair, open and non-discriminatory manner. 

I’ve spent 20 years working on technologies with the goal of ensuring service providers do not become a “dumb pipe” [1].  I am fully biased toward ensuring the success of SPs and believe that “net neutrality” is unfair to them [2].  However, SPs should not spend $ billions on VoLTE just because it’s voice.  High-speed, high-quality broadband is the future.  Don’t fight it.

To discuss this please contact me at gwhelan@greywale.com


 Notes:       
  1.  http://greywhalemanagement.blogspot.com/2012/07/if-sps-become-dumb-pipe-everybody-loses.html
  2.  http://greywale.blogspot.com/2014/02/net-neutrality-overruled-win-for.html


 Click here for an INDEX of Articles and Posts

Monday, February 10, 2014

IoT? Internet of Things....What is a Thing?


Internet of Things, or IoT, is a topical conversation these days.  Companies with a vested interest, such as Cisco, have announced this market to be $Billions and $Billions in the not-so-distant future.

The word “thing” is a good one here.  You can add “no” and “every” to the front of it and get other proper words.  So IoT can mean “nothing” and “everything”.  That’s exactly what it means today.


A market of nothing and everything is not a real market.  It’s either a ZERO-billion-dollar market (nothing) or an infinite-billion-dollar market (everything).  Zero-dollar markets don’t sell market research reports and space at trade shows, so the industry tends to favor the infinite-dollar market.  Thus we see reports of IoT being a $19 TRILLION market (Cisco), a $14 to $33 Trillion market (McKinsey) and a mere $2 Trillion market (Gartner). 

We’ve seen this movie before.  In the 1990s the market for “Multimedia” was predicted to be many billions, and more recently we hear the market for “Cleantech” will be multiple billions.  Yet, like the term IoT, these words meant nothing and everything.

When asked what multimedia applications were, the answers were always video editing, video conferencing, training and kiosks.  Not sure about “kiosks”, but the other three are not multimedia applications; they are specific, identifiable markets.

Similarly, what are cleantech applications?  Energy efficiency, renewable energy and smart grid are often the answers.  Here again, these are not cleantech applications; they are specific, identifiable markets. 

So let’s drop the hype around IoT and start talking about real markets that combine sensors, IP networks and analytics.  I almost said “Big data”, but that’s another “nothing” and “everything” market.

For further discussion please contact me at gwhelan@greywale.com

Click here for an INDEX of Articles and Posts

Wednesday, September 18, 2013

A Telco Energy Strategy Should Demand ZERO Impact on Service


 As energy strategies reach the boardroom, service provider management should insist on “zero impact” on services.  The stakes are too high in the competitive, zero-sum game they participate in.  Customer satisfaction, reduced churn and a strong brand are paramount in this environment.  By treating energy as a strategic initiative, they will achieve the benefits of lower OPEX, an enhanced brand and more efficient end-to-end operations.  Tactical energy initiatives will not get funded if they have a perceivable adverse effect on consumer and business services.  These adverse effects could be short-lived, as during installation, or long-term, if, for example, latency is introduced.   Thus, their energy strategy should demand zero impact on services.

Note the emphasis on “services” instead of “network”.  It would be unreasonable to demand zero impact on the network if you are deploying a new architecture or an energy-aware protocol.  Yet, with IP (Internet Protocol), the impact on the network should not cause a perceivable impact on services. 

Is zero impact unreasonable, and wouldn’t “minimal impact” be a better goal?  The challenge there would be to define what “minimal” means.  Would it mean X video anomalies per 30 minutes?  Why not X+1?  Would it mean Y dropped calls per tower per minute?  Why not Y+1?  Also, who defines X and Y?  Would the CEO, CTO or CMO define them?  Would international standards organizations set them? 

 Setting the goal of “Zero Impact” sends a clear message throughout the organization of what is expected.  Terms such as “sustainability” and “green” will have clearer meaning.  Green projects that make people feel good but have no financial justification will fail fast so the real winners can progress.  Therefore, telcos and service providers should demand zero impact on services.

Contact: Greg Whelan at gwhelan@greywale.com to discuss.

Click here for an INDEX of Articles and Posts