
I’ve been aware of San Diego-based XR company Campfire since CES last year, where I was sworn to secrecy in order to gain access to its first AR demos. Since then, Campfire has quickly matured into a complete solution for professional 3D design and collaboration—one that explicitly seeks to take advantage of existing platforms to advance and improve AR and VR. Campfire has already raised $8M in seed funding, and its platform is on pace for commercial availability before the end of this year. However, while Campfire could easily be mistaken for just another AR or VR headset, it is much more than that. Let’s take a closer look.

The headset

If Campfire’s headset looks familiar, that’s because much of its IP comes from Meta View, the AR headset maker that shut down in 2019 and sold its assets to what would become Campfire. While Meta View’s headset was ahead of its time, the company failed to deliver functioning products on time, and those it shipped had many bugs, especially with tracking. Campfire picked up the technology and started over. There are five components to the Campfire platform: the Campfire Headset, the Campfire Console, the Campfire Pack, Campfire Scenes and the Campfire Viewer.

The Campfire headset features a resolution of 2560 x 1440 with a 60Hz refresh rate. Its 92-degree FoV (field of view) is capable of supporting both AR and VR applications with the quick switching of magnetically attached lenses for translucent (AR) or opaque (VR) use. While the original intention of this headset was AR, the ability to support VR is a good thing for its overall utility. While this means it could technically be classified as an XR headset, Campfire prefers to call it a Holographic Collaboration System. A direct-wired connection from the Campfire headset to the PC is necessary in order to provide the power and reliability required by professional users in the enterprise. While I believe that in time Campfire will likely move towards a wireless solution via WiGig (or some other mmWave solution), for now, that technology is not reliable enough for enterprise applications.

The console and controller

The Campfire system uses a tabletop ‘console’ for tracking and object permanence, but I expect that there will be more applications in the future. The Campfire headset features inside-out tracking but doesn’t utilize any RGB cameras—a design choice that could make it particularly well-suited for sensitive defense applications. The company also utilized this tracking system to develop a ‘Pack’ that attaches to any existing smartphone and turns it into a Campfire controller. I like this approach for inputs. Campfire takes advantage of hardware that users are already familiar with, and combines it with its own internal sensors and touchscreen. It’s a strong proposition.

The software

In addition to Campfire’s three hardware pieces, there are also two crucial software components that round the platform out. First, you have Campfire Scenes, which enables users to create scenes from existing 3D models for quick reviews. While this task traditionally necessitates powerful PC workstations, this is not the case with Campfire Scenes. This software allows companies, engineers and artists to build 3D models of products using the industry-standard software and applications they have always used.

In addition to Campfire Scenes is the Campfire Viewer. This offering enables multiple users to collaborate in the same space during video calls using a Campfire Headset, tablet or smartphone, making design reviews and other forms of spatial collaboration much more manageable. Interestingly, Campfire does not handle any audio. However, this makes total sense when you consider how many enterprise customers don’t want yet another service for voice communications to have to certify and qualify. While this may not always be the case, I believe that many companies are happy to take advantage of their preexisting communication platforms. This decision shows that Campfire wants to fit into existing professional 3D workflows rather than moving them in an entirely new direction—a unique approach. Campfire also shared that it is leveraging a major cloud vendor for secure collaboration, but hasn’t identified the company by name as of yet.

Lastly, Campfire has its own enterprise management console, which means that IT departments won’t have to onboard new users and create new identities and accounts. Instead, IT departments can use their existing domains and identities to log users into their respective accounts.

Hands on experience

I’ve gotten to try Campfire in its many different iterations over the past year, as it evolved from a headset and controller into a complete design and collaboration solution. All iterations of the Campfire headset I’ve tried were fully functional 3D-printed prototypes, which gave me an idea of what the headset would look and feel like. Every iteration of the headset was more refined, more comfortable and easier to use than the previous one. The Campfire team has also considerably improved the user experience and nailed the concept of enhancing workflows rather than disrupting them. The image quality is top-notch and, I believe, offers enough resolution for enterprise applications.

Campfire’s approach will also utilize a monthly subscription model for the complete solution, rather than parting out different pieces of the platform for different prices. While the company has not yet disclosed the all-inclusive monthly price, I expect Campfire is targeting enterprises that can afford it. That is to say, I expect the pricing to be accessible for many medium to large businesses. Campfire has already announced its deep involvement with Frog Design, one of the world’s leading design firms and the company behind the design of Campfire’s hardware. While I do not expect Campfire to sell the headset without the entire platform solution, if it did, I would expect it to cost roughly $1,000—already way outside of what consumers would pay.  In the long term, I could see Campfire offering upgrades to subscribers that leverage technologies like wireless or hand tracking.

The path forward

While the Campfire platform makes use of some of Meta View’s technological capabilities in the headset, it solves many of the problems that early AR headsets, like Meta View, had. While some had amazing optics, image quality and even a great field of view, in the end, most of them completely failed to deliver practical usage within existing workflows. Professional 3D designers and engineers don’t want to completely disrupt their existing workflows to take advantage of spatial computing and Campfire seems to get that. I believe that Campfire’s approach will resonate with certain segments of the market, such as defense, that are very sensitive to camera-based tracking technologies. I think the platform will also be attractive to companies that don’t want to have to reinvent the wheel in order to work in professional 3D environments.

An early access pioneer program for the Campfire platform is now open, with commercial availability planned for Q4 of this year. Considering where the market is today, with enterprises snapping up headsets for remote collaboration and design review, Campfire’s complete solution couldn’t have come at a better time. While Campfire is not alone in this market, it is by far one of the most comprehensive solutions I have seen to date. I believe it will be ready for enterprise deployment this year.

Disclosure: My firm, Moor Insights & Strategy, like all research and analyst firms, provides or has provided research, analysis, advising, and/or consulting to many high-tech companies in the industry. I do not hold any equity positions with any companies cited in this column.

Over the years, Amazon has been steadily building a product line that supports its goal of increased participation in the Amazon ecosystem. The company has built, bought, partnered and launched tools that expanded merchant participation and activity, grew its customer base, increased cart and checkout sizes and reduced friction on both the buy and sell sides.

Seemingly unstoppable by regulators or competitors, the company is armed with numerous patents, virtually unlimited cash, a massive, devoted customer base and unending data. With this, Amazon could represent a real threat to traditional banking. However, Amazon remains very focused on building financial services products that support its core strategic goal: increasing participation in the Amazon ecosystem and solving inefficiencies for its 310 million active customers, 100 million Prime customers, 50 million Echo owners and 5 million sellers worldwide (according to company data).

Amazon has also made several fintech investments to support its core strategic goals. All of this points to the conclusion that the company isn’t likely to build a traditional deposit-holding bank. Instead, it seems focused on taking the core components of banking and using them to best support its merchants and customers. As a CB Insights study summarized, “In a sense, Amazon is building a bank for itself — and that may be an even more compelling development than the company launching a deposit-holding bank.”

Still, there are several categories where traditional banks shouldn’t rule out Amazon as a significant competitor. With Amazon approaching $1 trillion in market capitalization, it can’t be ruled out of any race. The company has long been experimenting with financial initiatives, including partnering with major US banks to offer accounts in its online marketplace. All the while, the company continues to expand and grow its existing financial offerings. As the Office of the Comptroller of the Currency moves to introduce policy that will level the playing field in banking, I believe fintechs, big tech and startup disruptors may gain a greater competitive advantage.

Amazon’s DNA is to be the platform. The company is rooted in distribution, integration, logistics, convenience and instant gratification. When Amazon applies those roots to financial services, it can help financial institutions process, underwrite and service loans at a lower cost than what banks currently incur while fulfilling a higher demand. The company has no reason to be the lender in this case. It simply takes a cut of the FI’s business while offering vertical ancillary solutions like KYC and AML at an additional cost.

Beyond banking

Just Walk Out, a payments technology pilot program in Amazon Go stores, allows customers to purchase items without checking out. The company is likely to push the technology out to other Amazon brands such as Whole Foods as soon as Q2, and it could also expand outside of the Amazon lair to other grocery and retail chains.

The company has also started testing Amazon One, which uses palm-scanning technology to eliminate the need for any physical payment method. Consumers can also tell Alexa to pay for gas, movie tickets and utility bills through partnerships with fintechs and other companies. According to CB Insights’ Industry Analyst Consensus, the global voice shopping market is poised to grow from $2B to $40B by 2022.

Amazon isn’t the only player in the automated retail technology space. AiFi makes retail stores autonomous with its compelling computer vision platform. Through its unique approach of partnering with retailers, AiFi can provide a hybrid experience tailored to stores’ existing customer bases (users can still opt for a traditional cashier). For example, just last week the company announced the opening of a 5,000-square-foot autonomous store in Denver, CO, in collaboration with Choice Market. While AiFi has only ten live locations to Amazon Go’s 29, it’s delivering healthy competition and, purportedly, a better customer experience.

Wrapping up

While Amazon has had numerous product pivots and failures along its way, it still isn’t afraid to iterate as it moves from e-commerce to omnichannel enablement. Next-generation bank? Probably not. But one way or another, Amazon is likely going to use its vast data, distribution and enablement to change the way its customers experience banking.


Today, Oracle and Tanium announced that Tanium has selected Oracle Cloud Infrastructure (OCI) as part of its multi-cloud approach to delivering its SaaS platform, Tanium-as-a-Service (TaaS).

Tanium is in the business of providing endpoint management and security, trusted by many large enterprises worldwide. Moor Insights & Strategy recently published a research paper discussing the importance of endpoint management and security when work has moved to the cloud, home, and everywhere in between.

In my conversation with Orion Hindawi, cofounder and CEO of Tanium, I was intrigued to learn how he approached multi-cloud and the criteria he used to evaluate cloud providers. First and foremost, he has to maintain a high level of trust with a most discerning customer base.

The time is right for Tanium-as-a-Service (TaaS)

Historically, customers have deployed the Tanium platform on-premises to perform both security and operations management for a wide range of devices, from IoT endpoints up to servers, including virtualized and cloud computing environments.

About three years ago, customer perceptions of the cloud had shifted enough that it made sense for Tanium to offer a cloud-based solution. The largest enterprises still wanted to deploy on-premises at that time, but early adopters were getting comfortable with the cloud-hosted solution known as Tanium-as-a-Service (TaaS). In the past six months, Tanium has seen 85% of new customers choose TaaS instead of deploying on-premises. One of the reasons customers like TaaS is visibility into all assets, on-premises and cloud-hosted, regardless of location.

The work-from-home phenomenon has fueled the adoption of TaaS. Enterprises realize a persistent need to manage devices that are essentially out in the wild and not on a VPN. Faced with the task of managing and monitoring everything quickly, customers recognize that Tanium will do a better job running a cloud service than they would building everything from scratch.

Multi-cloud makes sense but comes with a cost

The main reason Tanium went with a multi-cloud approach is to realize measurable benefits from one cloud versus another. As a vendor that supplies a cloud-hosted solution, Tanium considers multi-cloud necessary because limiting itself to a single cloud provider is not optimal for the company or its customer base.

But that leverage comes with a cost. The option to pick and choose the optimal services from each cloud provider can result in a better and less expensive service for the customer. The price of this flexibility is having to reimplement things like authentication, which must be done differently in each cloud.

Tanium moved the TaaS application wholesale from AWS to Oracle. The migration required real engineering work, as parts of the application relied on AWS-specific functions.

OCI is an entirely new infrastructure developed from the ground up with no resemblance to its predecessor. The design goals were better performance, pricing, and—above all else—security. Other more established cloud providers are now investing enormous amounts of money in rebuilding the management stack.

The challenge is compatibility across clouds. If you use something that is not in a particular cloud, there is no guarantee that it will work. To shift a workload often requires reengineering the management construct, and there are network and data implications.

Why Tanium chose Oracle Cloud Infrastructure (OCI)

Tanium’s CEO said the company evaluated all cloud providers against a matrix of requirements that included the security posture, cost structure, and ability to scale. Additionally, the cloud provider needed to understand Tanium’s requirements and be willing to collaborate on extending existing services to deliver more value.

Tanium is in the business of security. Apart from Oracle’s evident security pedigree, Tanium was impressed by Oracle’s investment in hiring strong people into the security team. From the standpoint of cost, Oracle was committed to making the partnership cost advantageous to Tanium. Oracle was able to demonstrate scalability with Zoom as a great proof point.

The other advantage for Tanium was the mutually beneficial partnership. With Oracle’s focus on the enterprise, there is an incentive to show customers how OCI works better with Tanium.

Tanium is in production on OCI today

Migrating to a new cloud provider is not trivial. It requires engineering work and notification to customers. The benefit is to increase the quality of service. Feedback from Tanium’s customers on the migration to OCI has been overwhelmingly positive. Customers using the cloud-hosted service are on OCI today.

Wrapping up

I have followed the evolution of Oracle Cloud Infrastructure (OCI) and have written several articles on the subject. I was a skeptic in the early days, but over time I became an admirer of the Oracle Gen2 architecture, redesigned from the ground up. This decision by Tanium, with its set of demanding customers, is a strong endorsement that Oracle is doing things right.

Note: Moor Insights & Strategy writers and editors may have contributed to this article. 

Moor Insights & Strategy, like all research and analyst firms, provides or has provided paid research, analysis, advising, or consulting to many high-tech companies in the industry, including 8×8, Advanced Micro Devices, Amazon, Applied Micro, ARM, Aruba Networks, AT&T, AWS, A-10 Strategies, Bitfusion, Blaize, Box, Broadcom, Calix, Cisco Systems, Clear Software, Cloudera, Clumio, Cognitive Systems, CompuCom, Dell, Dell EMC, Dell Technologies, Diablo Technologies, Digital Optics, Dreamchain, Echelon, Ericsson, Extreme Networks, Flex, Foxconn, Frame (now VMware), Fujitsu, Gen Z Consortium, Glue Networks, GlobalFoundries, Google (Nest-Revolve), Google Cloud, HP Inc., Hewlett Packard Enterprise, Honeywell, Huawei Technologies, IBM, Ion VR, Inseego, Infosys, Intel, Interdigital, Jabil Circuit, Konica Minolta, Lattice Semiconductor, Lenovo, Linux Foundation, MapBox, Marvell, Mavenir, Marseille Inc, Mayfair Equity, Meraki (Cisco), Mesophere, Microsoft, Mojo Networks, National Instruments, NetApp, Nightwatch, NOKIA (Alcatel-Lucent), Nortek, Novumind, NVIDIA, Nuvia, ON Semiconductor, ONUG, OpenStack Foundation, Oracle, Poly, Panasas, Peraso, Pexip, Pixelworks, Plume Design, Poly, Portworx, Pure Storage, Qualcomm, Rackspace, Rambus, Rayvolt E-Bikes, Red Hat, Residio, Samsung Electronics, SAP, SAS, Scale Computing, Schneider Electric, Silver Peak, SONY, Springpath, Spirent, Splunk, Sprint, Stratus Technologies, Symantec, Synaptics, Syniverse, Synopsys, Tanium, TE Connectivity, TensTorrent, Tobii Technology, T-Mobile, Twitter, Unity Technologies, UiPath, Verizon Communications, Vidyo, VMware, Wave Computing, Wellsmith, Xilinx, Zebra, Zededa, and Zoho which may be cited in blogs and research.

Following Cisco’s signature Live! 2021 event in late March, I recently had the opportunity to speak with Jonathan Davidson, who leads the company’s Mass-Scale Infrastructure Group (MIG). During our one-on-one, we discussed several topics, including the recent Acacia acquisition, 5G and its private cellular networking opportunity, and the book he co-wrote nearly 20 years ago! It was an engaging conversation and one that I believe reveals much about the success of Cisco’s service provider business unit.

The early years

Mr. Davidson began his career at Cisco in early 1995, through the acquisition of Combinet, an ISDN access company. While he got his start answering phones in the company’s TAC, troubleshooting networks, he rose over a decade and a half into solution engineering and product management roles. During that time, he even found the time to co-author a book on Voice over Internet Protocol (VoIP) fundamentals (and edited another on the deployment of VoIP). If interested, you can find the first book here. I am impressed by this accomplishment—I’m currently writing my first book, and I’m learning just how challenging it can be.

Mr. Davidson left Cisco after his first tour of duty to pursue an opportunity at Juniper Networks that allowed him to enter the executive management realm. While at Juniper, Mr. Davidson managed several areas, including routing, switching and security, and served as the lead for its campus and data center business unit. Eight years later, he returned to Cisco in the fall of 2018 to lead the service provider business after Yvette Kanouff’s departure to work with start-ups. Flash forward to the present, and Mr. Davidson has been instrumental in helping to close the Acacia acquisition, which brings with it the opportunity to strengthen Cisco’s optical networking portfolio.

The potential of routed optical networking  

Mr. Davidson and I also spoke about IP and how it has driven better economics over time—especially in optical networking. Cisco announced its intention to acquire Acacia in July 2019 and eventually closed the purchase earlier this year. Despite having to renegotiate the purchase price significantly, from my perspective, it was a smart move for the networking giant. Cisco is one of only a handful of infrastructure providers that offers a complete optical networking portfolio, and Acacia complements it with pluggable coherent optics. Pluggable technology is compelling because it can simplify network operations, reduce complexity and provide better investment protection through modularity. The resulting routed optical capability also offers significant capital and operational expense savings while providing more bandwidth per link.

The old technology adage of “faster, cheaper, better” is certainly applicable for the combination of Cisco and Acacia. I recently wrote on the integration of Acacia into the Cisco portfolio following the Cisco Live! 2021 event in March. If interested, you can find that article here.

Mr. Davidson and I spoke about the importance of allowing Acacia to continue to serve its large customer base in China and the rest of the world. This stipulation was one of the requirements with China’s final approval of the merger, and from my perspective, it is good for optical networking in general. Providing customers with more choices keeps pricing competitive, but it also stimulates innovation by having more participants that are vying to stand out in the ecosystem. 

The 5G opportunity

During our conversation, Mr. Davidson and I also discussed the opportunity tied to 5G. The Cisco MIG team is involved in many 5G initiatives that span private wireless, Open Radio Access Network (OpenRAN) and cloud-native architectural approaches. These initiatives aim to help operators reduce costs, mitigate risk, scale deployments and find new monetization opportunities. The latter is compelling given that, in my mind, Over-the-Top (OTT) mobile application providers did a better job of monetizing services over 4G LTE cellular network connections than the operators that funded the deployments. While the past was mostly about access and adding unlimited subscriber data plans, operators cannot run the same playbook with 5G.

The enormous expense involved in deploying 5G infrastructure and purchasing increasingly costly spectrum makes the old way of doing things untenable.

From a private wireless standpoint, Mr. Davidson views the opportunity as having multiple routes to market, but the key to its success will lie in simplicity. I agree: enterprise network professionals have years of experience with Wi-Fi, which has been relatively straightforward thanks to its use of unlicensed spectrum. Private cellular networking requires the use of licensed spectrum and topologies that knit the core to RAN hardware and software. It is an entirely different animal. If Cisco can make it as simple as the Wi-Fi deployment and operational experience, it will be a winner with specific use cases. I personally believe that most use cases for private cellular are within non-carpeted operational technology (OT) environments—those that have been traditionally unconnected or served by connectivity solutions that do not scale or easily talk to one another. On the other hand, I expect that Wi-Fi will continue to lead deployments in information technology (IT) environments, given the existing install base and Wi-Fi 6’s improvements in speed, latency and device density.

Wrapping Up

Cisco has brought a complete optical networking platform to market through a combination of organic roadmap development and acquisitions. Al Gore once staked his claim to the Internet, but clearly, Mr. Davidson and his MIG organization can make a better argument for their involvement. Cisco is on a mission to deliver the Internet for the future by providing the needed headroom and scale to accommodate the increasing traffic and proliferation of data (and its exchange). It is an audacious goal, but one that I believe the company is well-positioned to deliver on.  

Disclosure: My firm, Moor Insights & Strategy, like all research and analyst firms, provides or has provided research, analysis, advising, and/or consulting to many high-tech companies in the industry, including Cisco Systems, cited, or related to this article. I do not hold any equity positions with any companies cited in this column.

A digital platform is any network that facilitates connections and exchanges between people—exchanging goods, services, or simply communications. Facebook, Uber and Airbnb are all examples of digital platforms, giving you an idea of the magnitude of this trend. In fact, digital platforms are turning established business models on their head. Just think of the impact Airbnb is having on the travel sector, which is leading many traditional, non-tech businesses to consider transitioning to or incorporating a platform-based model.

But why are digital platforms so significant right now, and what can your business do to prepare for this transformative trend? Read on to find out.

Why are platforms such a key trend now?

This notion of facilitating connections between one party and another is hardly new. If you think about it, a newspaper is a platform that connects advertisers to readers. Or a shopping mall is a platform that brings consumers and different brands together under one roof.

What’s new about today’s most powerful platform businesses is that the connections they facilitate take place online. In other words, digital platforms draw upon related technology trends, such as mobile devices, artificial intelligence, big data, cloud computing, and automation. All these tech trends have combined to create a perfect storm, giving rise to a new wave of highly successful digital platform businesses. That’s what makes this trend so significant right now.

5 questions to help your business leverage the digital platform model

If you think digital platforms are just for tech companies, think again. Across many different sectors, organizations are investing in digital platforms – be they platforms of their own making, or collaborations with existing digital platforms. This revolution is impacting all kinds of companies, from small businesses and nimble new startups to large corporations with a more traditional pipeline business model.

That’s why I believe every company should have a platform strategy. However, for many businesses, transitioning to or incorporating a platform strategy is far from an easy, overnight transition. Digital platforms often mean a fundamental change to business models and strategy, which means you’ll have to think carefully about how best to leverage the platform model to drive success.

The following five questions will help you do just that.

1. Where’s the value?

This concept of creating or capturing value is essential to the success of any platform. Therefore, a good starting point is to ask how your intended users would benefit, both from the platform itself and by connecting with others in the platform. At this stage, many businesses try to position themselves as “Uber for [insert product here],” but I believe it’s more important to focus on your unique value proposition as a business. To put it another way, what is it you do best, and how could a platform help you build on that success?

This doesn’t necessarily mean abandoning your existing business model. Rather, there might be opportunities to create a platform-powered additional revenue stream or service. As an example, rail transit company Siemens Mobility, part of Siemens AG, has created the Easy Spares Marketplace – a platform that brings together manufacturers, dealers, and customers, allowing users to order all the spare parts they need in one place.

2. Do you have the necessary platform skills?

Many traditional corporations simply don’t have the expertise – or even the culture – to seamlessly adopt a platform business model. This means you may need to look outside the company to the world of entrepreneurs and tech startups and potentially create a joint venture to access the skills you need.

A great example of this comes from Nissan, which has been in talks with Didi Chuxing (China’s version of Uber) to create a ride-sharing service centered on Nissan’s electric vehicles.

3. How will you attract people to your platform?

Without people, platforms fail. Facebook, for example, relies on its community of users to generate and post the content people want to read and see, just as Airbnb relies on attracting people with homes and rooms to rent.

The community is critical to the success of your platform. You, therefore, need to work out how you’ll “seed” users to your platform – and this may involve offering free services, low prices, reduced commission, or a uniquely specialized offering. Think of how Amazon cornered the market by offering books cheaper than any other bookstore.

4. How will your platform encourage and support user interactions?

Ideally, your platform would become the core of the community it serves – it needs to be the place where consumers or users connect with people who can provide the information, goods, or services they need.

This means your platform needs to encourage and facilitate valuable interactions between participants. A key part of this value comes from developing appropriate governance policies on what is and isn’t acceptable behavior on the site, in order to ensure users continue to have a great experience.

5. How will your platform integrate future technologies?

Digital platforms rely on other tech trends, but one in particular could prove a significant competitive challenge for platforms: blockchain technology. With blockchain, users can transact directly with others in a safe, secure way—potentially eliminating the need for middleman platforms like Uber and Airbnb. As an example, the Arcade City ride-sharing app—created in response to one driver’s frustration with Uber’s way of working—is based on blockchain technology and has been described as an Uber killer.

The good news is, if you’re moving into platforms now, you have the opportunity to leapfrog existing platforms and harness new technologies like blockchain to your advantage at the outset. Ask yourself, is there an opportunity to use blockchain to create a new, more decentralized way of doing business in your industry?

The digital platform model is just one of 25 technology trends that I believe will transform our society. Read more about these key trends – including plenty of real-world examples – in my new book, Tech Trends in Practice: The 25 Technologies That Are Driving The 4th Industrial Revolution.

Cyber threats are a fact of life for nations and companies around the world. The United States government has recognized and addressed the growing risk of cyberattacks from adversaries since at least 2001, when President George W. Bush appointed Richard Clarke as the first Cybersecurity Czar—a special adviser to the president on issues of computer security. A lot has changed since 2001—both in the technology attack surface and the threat landscape—and cyberattacks have emerged as the primary battlefield in a new “Cold War” between the United States and its primary adversaries. In March, a panel of experts convened for a virtual roundtable titled “Restoring National Cybersecurity: A Look into the First 100 Days of the New Administration” to discuss the challenges we face and offer guidance on how to address them effectively.

We are nearing the end of President Joe Biden’s first 100 days in office. The first 100 days is generally recognized as a combination of honeymoon phase—as cabinet positions are filled, and individuals get acquainted with their roles and ramped up on the work to be done—as well as a significant milestone—as the nation considers the early tenor and vision of the policies being pursued by the new president. The job of President of the United States is never easy, but President Biden’s challenges were compounded by inheriting the fallout of gross negligence and incompetence by the former administration on virtually every front—from the economy, to foreign relations, to the climate, to education and infrastructure, to the urgent need to implement a functional plan for dealing with the COVID-19 pandemic and expediting vaccinations across the country. On top of all of that, the nation is facing a large and growing cyber threat from adversary nation-states and cybercriminals that can’t be ignored.

The roundtable discussion was hosted by Cybereason and moderated by David Spark. The panel comprised Lior Div, co-founder and CEO of Cybereason; Theresa Payton, CEO of Fortalice Solutions and former White House CIO; Corey Thomas, CEO of Rapid7 and a board member of the Cyber Threat Alliance; and Michael Daniel, president and CEO of the Cyber Threat Alliance and President Obama’s former Cybersecurity Coordinator. Each panelist brings valuable cybersecurity expertise to the table, along with experience addressing cyber threats from nation-states.

The roundtable was coordinated in the wake of the SolarWinds attacks that were discovered at the end of 2020. US intelligence sources and cybersecurity experts have attributed those attacks—which affected tens of thousands of systems around the world—to Russia. The agenda of the discussion was to develop an action plan that might help guide the Biden Administration as it strives to respond to these types of attacks and strengthen the cybersecurity posture of the nation in general to prevent similar attacks in the future. The discussion became even more relevant and poignant when another massive attack was revealed just days before the roundtable sessions took place. HAFNIUM—a hacker group based in China—exploited a series of zero-day vulnerabilities to compromise tens of thousands of Microsoft Exchange Servers.

David Spark started the session talking about the budget allocated for cybersecurity in the American Rescue Plan legislation and asked Lior Div for insight on how he would begin restoration of America’s cybersecurity defenses.

Lior noted that the United States is under virtually continuous attack from Russia and China and suggested that we need to start by changing our mindset. He pointed out how the current situation is a rekindling or extension of the Cold War, but also that the objectives have shifted. “You can gather information, and you can manipulate information as much as you want. In general, I would say two things. One goal is espionage. And the other one is to control kind of what people think. And we can go back all the way back to 2016, when we had the election, when the Russians tried to influence it heavily, and even in the last election.”

David then asked Theresa Payton to weigh in on what she believes we are doing poorly now and what is the first thing we need to address. Theresa started off by offering praise and appreciation for the Herculean task that CIOs and CSOs have been faced with during the COVID-19 pandemic as entire organizations suddenly went 100% remote—simultaneously obliterating any concept of a network perimeter and vastly expanding the attack surface that needs to be monitored and protected.

Theresa stressed that privileged (“superuser”) access should be strictly limited, and that organizations should ensure users rotate their passwords and keep them unique. She added, “Accounts have got to be monitored with behavioral-based monitoring, segmentation of everything. So, the more you can segment everything down to the most granular level, when that data incident happens—which it will—you have the ability to go shields up and flip kill switches so that you can actually mitigate the incident and still have resiliency in the organization.”
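
As a toy illustration of the behavioral-based monitoring Payton describes, consider baselining the hours at which a user normally logs in and flagging departures. This is a deliberately simplified sketch, not any vendor’s actual detection logic; real systems model many more signals (geography, device, access velocity):

```python
from collections import defaultdict

class LoginMonitor:
    """Toy behavioral baseline: flag logins at hours a user has never
    been observed at before. Illustrative only."""

    def __init__(self):
        self.seen_hours = defaultdict(set)  # user -> hours observed

    def record(self, user, hour):
        self.seen_hours[user].add(hour)

    def is_anomalous(self, user, hour):
        baseline = self.seen_hours[user]
        # An unknown user or an unseen hour both count as anomalous.
        return not baseline or hour not in baseline

monitor = LoginMonitor()
for h in (9, 10, 11, 14):           # alice normally works daytime hours
    monitor.record("alice", h)

print(monitor.is_anomalous("alice", 10))  # False: within her baseline
print(monitor.is_anomalous("alice", 3))   # True: a 3 a.m. login is unusual
```

The same shape generalizes to Payton’s segmentation point: the more granular the baseline (per account, per segment), the more precisely an incident can be contained.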

Michael Daniel pointed out that nobody wants to get hacked, and nobody is intentionally doing cybersecurity poorly. He recommended that we step back and understand why government agencies and private sector organizations struggle with the basic fundamentals of cybersecurity, and try to figure out what we can do to improve. For starters, he suggested that we not place so much of the burden on the end user. He explained that we expect drivers to be responsible for actually clicking their seatbelt into place when driving a vehicle, but there are other elements of vehicle safety that are automated. “We don’t have a car say, ‘Excuse me, you’re about to have an accident. Would you like me to deploy the airbags: Yes, or no?’ Like, it just does it, right?”

Corey Thomas noted that it may sometimes seem futile—especially when facing a nation-state attacker that has significantly more resources at their disposal. He stressed, though, that it isn’t just about completely eradicating the threat. There is value in simply raising the bar and making it more challenging so there are fewer attacks, or the attacks take longer to execute, or the impact of the attacks is diminished.

Lior emphasized that he spent more than 20 years of his life being on the other side—being a nation-state hacker for the West. With the benefit of perspective from both sides of the fence, he stressed that we need to stop treating nation-state attacks as being too complex or sophisticated for us to defend against effectively. “I think that that was an excuse for many, many years for many companies of saying, ‘Oh, this is nation-state. We cannot do anything about it.’ I think that by now we have the technology. We’re 10 years into that after this event. I think that there is enough innovation that we drove collectively in order to fight against them.”

It was a valuable and insightful roundtable discussion. As we approach the end of the first 100 days of the Biden Administration, the cyber threat landscape seems to be intensifying even more. Acer was reportedly hit by a ransomware attack demanding $50 million in ransom. A few weeks later, in the wake of sanctions by the Biden Administration against Russia and potentially in retaliation for that action, Quanta—a major partner and supplier for Apple—was also hit by a $50 million ransomware attack. Meanwhile, researchers found that the Prometei Botnet is leveraging the exploits from the HAFNIUM attack to target vulnerable Microsoft Exchange systems. The breaches and compromises seem to be increasing in frequency and escalating in scope and impact, so it’s imperative we take action quickly.

I expect and hope that the Biden Administration will seek out experts like those who participated in this panel and involve them to better understand the threats we face, and to provide guidance on how to address those threats effectively and improve cybersecurity.

Gaming software is serious. Although derided by some as child’s play, the gaming industry is one of the most advanced sectors in the total technology landscape. While the IT industry hype machine often concentrates on AI breakthroughs, the efficiencies of increasingly automated cloud services and the proliferation of shiny new devices, the gaming industry has quietly grown into the multi-billion-dollar business it is today.

Today, gaming is bigger than the movie and music industries combined. Reports suggest that some 2.7 billion people regularly engage in gaming, and gamers now span a wide range of age groups. The market is projected to surpass $200 billion in revenue by 2023.

Gaming software is like enterprise software

What many non-technical observers fail to see is the degree to which game development uses the same tools, techniques, platforms and processes that we see applied to enterprise software application development every day.

No new version of Assassin’s Creed, Tomb Raider, Red Dead Redemption or Gears of War ships without a huge amount of software code being collated and managed in a version-controlled repository. It is because of this that software configuration management (SCM) companies like Perforce are well known for having a finger in both the enterprise software pie and the gaming software pie at the same time.

Games developers also use the same kinds of collaboration tools, very similar application performance management & testing functions, equally powerful security robustness & cyber protection layers and cloud service connectivity tools as their enterprise counterparts. The only difference, arguably, is that games developers have more fun with the finished product than their Enterprise Resource Planning (ERP) systems pals across the way.

But games have gone online, and the rise of Massively Multiplayer Online Role-Playing Games (MMORPGs) has created a new technology infrastructure requirement in the shape of connectivity. Where connectivity and network transport fail the gamer, the result is network latency, or lag. Take it from an ardent gamer: network lag is more infuriating than the final boss in any of today’s top titles.
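
To make “lag” concrete: latency is usually estimated as round-trip time (RTT). The sketch below times TCP connection handshakes against a throwaway local listener standing in for a game server; real games typically measure RTT in-protocol against their own servers, so treat this purely as an illustration:

```python
import socket
import threading
import time

def measure_rtt(host, port, attempts=3):
    """Estimate round-trip latency (ms) by timing TCP handshakes.
    A rough proxy for a game-server ping."""
    samples = []
    for _ in range(attempts):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2):
            pass
        samples.append((time.perf_counter() - start) * 1000.0)
    return min(samples)  # best-case sample filters transient jitter

# Throwaway local listener acting as the "game server" for the demo.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen()
port = server.getsockname()[1]
threading.Thread(
    target=lambda: [server.accept() for _ in range(3)], daemon=True
).start()

rtt_ms = measure_rtt("127.0.0.1", port)
print(f"estimated RTT: {rtt_ms:.2f} ms")
```

Across continents the same measurement routinely reaches 150–300 ms, which is exactly the delay a player perceives between pressing a button and seeing the game react.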

As the world moves to remote work, solving game latency or lag issues will also affect the health and performance of other work applications. Saad Siddiqui, a principal at Telstra Ventures, a strategic growth investor in lighthouse technologies headquartered in Sydney, Australia and San Francisco, argues that this whole topic is becoming a real issue. The team at Telstra Ventures says it is backing projects and initiatives that could save the day.

The #1 challenge for gamers is latency

“There’s a new generation of gamers worldwide and they expect nothing short of a flawless gaming experience. Game developers and publishers are fighting to deliver fresh gaming experiences — and one of the most considerable challenges is the need for fast, resilient networks and global infrastructure. The challenge of lag or latency is the annoying delay between a player’s move and the game’s reaction. It is often reported as the #1 problem faced in multiplayer online games – especially across continents,” said Siddiqui.

So how do we tackle this issue in technical and economic terms? Game developers (or, perhaps more often, the executives they work for) rank solving lag among their top priorities. Every second lost to lag or jitter can cost game developers millions in lost revenue. Games including Fortnite, League of Legends and Call of Duty have looked to industry options that provide a form of “upgraded Internet” service: an optimized, super-charged fast lane that reduces lag for gamers.

One example in the Telstra Ventures portfolio is Subspace, a company and technology that claims to improve latency and local network performance for gaming via a globally deployed infrastructure serving hundreds of millions of gamers. As Subspace CEO and founder Bayan Towfiq explains, “The Internet is a loose federation of independent networks that was designed for resilience, but not for real-time applications.”

Towfiq balances the gaming side of network lag with more life-critical applications. He says that users in many parts of the world cannot play online games, use play-by-play betting applications or (on the serious side, which depends on the same connectivity factors) access telemedicine services.

Weather mapping and pathfinding the Internet

“Subspace is building a parallel ‘speed of light’ Internet that supports real-time application requirements natively. We’re deploying infrastructure in hundreds of cities and from those locations, we’re using our software to ‘weather map’ the Internet and find the ‘fast paths’ before then working to pull game traffic through those paths. But we can’t just fix the Internet with pathfinding, we go a couple of layers deeper and have also built a networking stack that understands the needs of applications…  and so coordinates on a global scale,” added Towfiq.
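
At its core, the “weather map and fast path” idea is shortest-path routing over continuously measured link latencies. A minimal sketch using Dijkstra’s algorithm follows; the relay cities and per-hop latency figures are invented, and this bears no relation to Subspace’s actual implementation:

```python
import heapq

def fastest_path(graph, src, dst):
    """Dijkstra over per-link latency measurements (ms): find the
    lowest-total-latency route from src to dst through relay points."""
    dist = {src: 0}
    prev = {}
    heap = [(0, src)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == dst:                      # reconstruct the route
            path = [dst]
            while path[-1] != src:
                path.append(prev[path[-1]])
            return d, path[::-1]
        if d > dist.get(node, float("inf")):
            continue                         # stale heap entry
        for nbr, latency in graph.get(node, {}).items():
            nd = d + latency
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    return float("inf"), []

# Hypothetical "weather map": measured latency between relay cities (ms).
latency_map = {
    "player":    {"sydney": 12, "singapore": 95},
    "sydney":    {"singapore": 90, "tokyo": 110},
    "singapore": {"tokyo": 70},
    "tokyo":     {"game-server": 5},
}
total, route = fastest_path(latency_map, "player", "game-server")
print(total, route)  # 127 ['player', 'sydney', 'tokyo', 'game-server']
```

The interesting engineering is not the algorithm but keeping the latency map fresh: paths that are “fast” one minute can be congested the next, which is why Towfiq describes it as weather mapping rather than a one-off measurement.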

Game publishers now want to manage their network routing and get greater visibility into global network traffic. Telstra Ventures’ Siddiqui says this is where NS1, another portfolio company, is playing a role in upping the ante on delivering a consistent gaming experience. NS1 changes the way infrastructure is deployed and then optimised. It helps companies that want to build their own capabilities and relationships with network providers to improve game performance.
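
Latency-based traffic steering of this kind can be illustrated with a toy sketch: for each client region, send players to the point of presence with the lowest measured RTT. The regions, PoP names, and numbers below are invented, and this is not NS1’s actual API:

```python
def steer(measurements):
    """For each client region, pick the point of presence (PoP) with
    the lowest measured round-trip time. Purely illustrative."""
    return {region: min(rtts, key=rtts.get)
            for region, rtts in measurements.items()}

# Hypothetical RTT measurements (ms) from client regions to PoPs.
rtt_by_region = {
    "us-east": {"nyc-pop": 8,  "lax-pop": 70,  "fra-pop": 90},
    "eu-west": {"nyc-pop": 85, "lax-pop": 140, "fra-pop": 12},
}
print(steer(rtt_by_region))  # {'us-east': 'nyc-pop', 'eu-west': 'fra-pop'}
```

In production, steering decisions like this are typically made at the DNS layer and re-evaluated continuously as telemetry changes, which is what gives publishers the visibility and control described above.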

“With 2.7 billion people worldwide into online gaming, game publishers are fighting to get the highest number of players and engagement. There’s a massive opportunity for companies that can provide real-time player insights via a range of digital sources to measure, for example, how people are using games or apps. This data can significantly help with marketing efforts by the game publishers. One such company is Singular, which provides a platform that aggregates all audience data into a single dashboard,” said Siddiqui.

The global uptake of 5G will push down the price of gaming headsets while also helping on-screen resolutions increase, both of which will contribute to an even better gaming experience.

Siddiqui postulates that the conversation is already shifting to what 6G might look like and how it will be delivered. Another Telstra Ventures company, Cohere, is today working with mobile operators to increase “spectral efficiency” by mathematically mapping mobile networks – and this technology will be part of the emerging 6G conversation. He concludes by saying that this approach does for the network what Google did for the Internet: it makes the network searchable by mapping a mathematical model onto the location of all the information it carries.

Gamifying everything in the future

Could online game connectivity become even more of an issue in its own right going forward? Arguably, yes. Quite apart from gamers themselves driving the requirement, we are seeing ever wider use of highly connected real-time online applications (such as the betting example referenced earlier).

Combine that with the need for telemedicine, increased home and remote working in the wake of Covid-19, and the wider gamification of enterprise applications to encourage people to use work software in more always-on, community-connected ways, and you can see why we need a cure for newfangled Internet lag just as much as for jet lag (for which, as always, the remedy is mainly daylight, walking and lots of water).