With the pandemic-driven shift to remote work, the conventional defense infrastructure that agencies had spent decades constructing was thrown into disarray. Most agencies adopted the short-term fix of adding more virtual private networks (VPNs), which increased latency. Too many agencies rely on unmanaged web servers (used to extend the reach of internal services and applications) that fail to receive essential software updates. IT professionals should instead follow a long-term plan when choosing cloud services. The cloud radically shifts how organizations approach technology design and versatility. Data safety, transparency, and behavioral analytics are the essential building blocks of a versatile architecture that does not compromise on security.
Data Safety
Rather than focusing on where users are coming from, focus on their activity. The architecture needs to work efficiently no matter where workers are situated, and it should adapt to the risk of each location. To that end, agencies can enforce a zero-trust architecture, in which users are continuously authenticated. Zero trust is not an all-or-nothing approach but an adaptive one: protection improves when the risk level of the data itself is used to govern access.
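As a concrete illustration, the sketch below shows what a risk-adaptive access decision might look like: every request is evaluated, and the bar rises with the sensitivity of the data and the risk of the context. The field names, weights, and thresholds are illustrative assumptions, not any agency's actual policy.

```python
# A minimal sketch of risk-adaptive access control -- the "not all-or-nothing"
# idea behind zero trust. All scores, weights, and thresholds are assumptions
# chosen for illustration only.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_authenticated: bool
    device_managed: bool        # agency-issued vs. unmanaged endpoint
    location_risk: float        # 0.0 (trusted) .. 1.0 (high risk)
    data_sensitivity: float     # 0.0 (public) .. 1.0 (restricted)

def decide(req: AccessRequest) -> str:
    if not req.user_authenticated:
        return "deny"
    # Composite risk: context risk weighted by how sensitive the data is.
    risk = req.location_risk * (0.5 + req.data_sensitivity / 2)
    if req.device_managed:
        risk *= 0.6  # a managed device lowers, but does not remove, risk
    if risk < 0.3:
        return "allow"
    if risk < 0.6:
        return "step-up-auth"  # e.g., require an MFA re-challenge
    return "deny"

# Example: a remote worker on an unmanaged laptop requesting restricted data.
print(decide(AccessRequest(True, False, location_risk=0.8, data_sensitivity=0.9)))
# -> "deny"
```

The point of the sketch is the shape of the decision, not the numbers: the same user gets different answers depending on device, location, and the data being requested.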
Transparency
Security must never be an afterthought when moving to the cloud. A cloud-enabled architecture must provide sufficient monitoring and control. Don't simply lift on-premises solutions into the cloud – they serve a different purpose. Reverse proxies and APIs are needed as well. Teams must also account for unsanctioned clouds: because of mandatory remote work, many workers are using personal Zoom accounts, Slack workspaces, and Dropbox folders for official business. What do agencies do when every account is held by a different person? Transparency is essential. Agencies must know who is accessing their services, from where, and how.
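To make the "who, where, and how" concrete, here is a minimal sketch of a logging gateway: a WSGI middleware that records each request before passing it to the wrapped application, the same role a reverse proxy plays in front of a cloud service. It uses only the Python standard library; the names and log fields are illustrative.

```python
# A minimal sketch of transparency at the gateway: log who is accessing a
# service, from where, and how, before forwarding the request. Illustrative
# only -- a production deployment would sit behind a hardened reverse proxy.
import logging
from wsgiref.simple_server import make_server

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
log = logging.getLogger("access")

class VisibilityMiddleware:
    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        user = environ.get("REMOTE_USER", "anonymous")   # who
        source = environ.get("REMOTE_ADDR", "unknown")   # from where
        method = environ.get("REQUEST_METHOD", "-")      # how
        path = environ.get("PATH_INFO", "/")
        log.info("user=%s src=%s %s %s", user, source, method, path)
        return self.app(environ, start_response)

def demo_app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"hello\n"]

if __name__ == "__main__":
    make_server("", 8000, VisibilityMiddleware(demo_app)).serve_forever()
```

Every request now leaves an audit trail, which is exactly the visibility that unsanctioned personal accounts bypass.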
Behavioral Data Analytics
Most agencies concentrate on external threats but seldom look at what is happening internally. Minimizing insider risk requires watching what people actually do. Behavioral indicators are important in identifying compromised or risky users; red flags stay hidden if you pay attention only to external threat intelligence. Indicators of behavior include data-hoarding and mass-download alerts: sudden access to large volumes of information can signal a compromised account. Careful monitoring of mail, chat, web proxies, and other channels can surface negative and anomalous activity. Behavioral analytics must be part of a zero-trust architecture, which requires visibility into both sanctioned and unsanctioned clouds.
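As one simple example of such an indicator, the sketch below flags anomalous download volumes by comparing a user's activity today against their own recent history with a basic z-score. The threshold and field names are illustrative assumptions; real systems combine many such signals.

```python
# A minimal sketch of one behavioral indicator: flagging a data-hoarding or
# mass-download event as a deviation from the user's own baseline.
from statistics import mean, stdev

def is_download_anomaly(history_mb, today_mb, threshold=3.0):
    """Return True if today's download volume is anomalous
    relative to this user's recent history (z-score test)."""
    if len(history_mb) < 2:
        return False  # not enough history to establish a baseline
    mu, sigma = mean(history_mb), stdev(history_mb)
    if sigma == 0:
        return today_mb > mu
    return (today_mb - mu) / sigma > threshold

# Example: four weeks of modest daily activity, then a 20 GB pull.
baseline = [120, 90, 150, 110, 95, 130, 105] * 4  # MB per day
print(is_download_anomaly(baseline, 20000))  # True: a data-hoarding red flag
```

The same pattern generalizes: baseline each user's normal behavior per channel (mail, chat, web proxy), then alert on statistically unusual departures rather than on fixed rules alone.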
Many organizations had invested in continuity of operations, but their plans assumed an attack or outbreak that would require relocating to another site. Now, organizations are forced to re-evaluate how and where their employees work.
Cloud computing, fundamentally, is about change. Unsurprisingly, ever-increasing cloud usage is driving soaring demand for cloud bandwidth.
The new generation of cloud apps generates strong demand for bandwidth – not only between users and the data center, but also within the data center and between data centers. When data centers run these digital systems, heavy traffic flows not just between clients and the data center; there is also a great deal of east-west traffic, i.e., traffic moving between servers.
Server technology innovations also drive demand for bandwidth. These include faster processors that can churn through more data; smart NICs that offload work from server processors, freeing them for more data-intensive tasks; and FPGAs that let the NIC handle work such as encryption itself.
All of these advances increase the amount of data servers can process, and the network needs more bandwidth to keep up. Data center bandwidth demand is rising meteorically: Cisco's Global Cloud Index predicts that data center traffic will triple by 2021.
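For a rough sense of what "tripling" means year over year: assuming the index's usual five-year forecast window (an assumption not stated in this article), a 3x increase corresponds to roughly 25% compound annual growth.

```python
# Rough arithmetic only: tripling over an assumed five-year window
# implies a compound annual growth rate of about 25%.
growth_factor, years = 3.0, 5
cagr = growth_factor ** (1 / years) - 1
print(f"{cagr:.1%}")  # ~24.6%
```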
Networking vendors are stepping up to meet this demand. At the recent Optical Fiber Communications conference we saw the first 400 Gbit/s data center systems tested for distances of up to 2 km based on QSFP28 modules, along with other innovations to feed the data rates of the next decade. Solutions are also emerging for 600 Gbit/s and 1.2 Tbit/s data center interconnection. In short, bandwidth demand is driving a revival in optical technology as vendors race to fill the need.