Migrating applications and workloads to the cloud brings many benefits, such as scalability, lower cost, and easier maintenance, but moving data is not without risk. When an IT system or application goes down, the enterprise pays a high price: according to ITIC, 98% of organizations report that a single hour of downtime can cost more than $100,000.
"By 2022, at least 95% of cloud security failures will be caused by customer error," said Jay Heiser, research vice president at Gartner.
To avoid losses caused by your own mistakes, you need to understand the following seven common cloud data management pitfalls:
1. No data protection strategy
Keeping company data secure both at rest and in transit is critical, and if disaster strikes you need to be sure it is recoverable. Consider the full range of threats: malware, accidental deletion, and unrecoverable failures within your cloud infrastructure. If the worst happens, a provider's apology and a refund will not bring your data back, so a consistent, durable data protection strategy is essential.
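As one concrete illustration, object versioning keeps older copies of every file so that an accidental deletion or a malware-encrypted overwrite can be rolled back. Below is a minimal sketch using boto3 against a placeholder bucket name; it assumes an S3-compatible store and credentials that are already configured in the environment:

```python
# Minimal sketch: enable object versioning plus a lifecycle rule so older
# versions are kept for a fixed window and then expired automatically.
# "my-company-backups" is a placeholder bucket name.
import boto3

s3 = boto3.client("s3")

# Keep every version of every object written to the bucket.
s3.put_bucket_versioning(
    Bucket="my-company-backups",
    VersioningConfiguration={"Status": "Enabled"},
)

# Expire non-current versions after 90 days to cap storage cost.
s3.put_bucket_lifecycle_configuration(
    Bucket="my-company-backups",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-old-versions",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},
                "NoncurrentVersionExpiration": {"NoncurrentDays": 90},
            }
        ]
    },
)
```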
2. No data security policy
In a typical data center, data from countless unknown tenants is commingled on shared devices. Your cloud provider may assure you that your data is isolated, but you still need to identify the regulatory issues that apply to it. Access control also deserves attention, because basic cloud file services often do not provide the same user authentication or granular controls as traditional IT systems. One research firm estimates that the average cost of a data breach worldwide is $3.6 million. You need a multi-tier data security and access control policy that prevents unauthorized access and ensures your data is stored in encrypted form, securely and reliably, wherever it lives.
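One practical layer of such a policy is client-side encryption, so the provider only ever stores ciphertext. The sketch below uses the Python cryptography package's Fernet recipe; the file name is a placeholder, and in practice the key would come from a KMS or vault rather than being generated inline:

```python
# Minimal sketch: encrypt a file client-side before it ever leaves your
# network, so cloud storage only holds ciphertext.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

# Placeholder key handling; use a KMS or vault in production.
key = Fernet.generate_key()
cipher = Fernet(key)

with open("customer_records.csv", "rb") as f:   # placeholder file name
    plaintext = f.read()

ciphertext = cipher.encrypt(plaintext)

with open("customer_records.csv.enc", "wb") as f:
    f.write(ciphertext)

# Later, after downloading the object back from cloud storage:
restored = Fernet(key).decrypt(ciphertext)
assert restored == plaintext
```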
3. There is no fast data recovery strategy
With storage snapshots and previous versions managed by a dedicated NAS device, you can recover quickly from data corruption, deletion, or other potentially catastrophic events. Few cloud-native storage services, however, offer snapshots or an easy way to roll back to earlier versions, which leaves you more dependent on your latest backup. Look for flexible, real-time storage snapshots that provide fast recovery and rollback for business-critical data and applications.
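For example, on AWS EBS a snapshot can be taken before a risky change and later restored to a fresh volume. The boto3 sketch below uses placeholder volume and availability-zone values and is only meant to illustrate the snapshot-then-rollback pattern:

```python
# Minimal sketch: snapshot an EBS volume before a risky change, then restore
# a fresh volume from that snapshot if a rollback is needed.
import boto3

ec2 = boto3.client("ec2")

snapshot = ec2.create_snapshot(
    VolumeId="vol-0123456789abcdef0",          # placeholder volume ID
    Description="pre-migration rollback point",
    TagSpecifications=[{
        "ResourceType": "snapshot",
        "Tags": [{"Key": "purpose", "Value": "rollback-point"}],
    }],
)
print("Snapshot started:", snapshot["SnapshotId"])

# Wait for the snapshot to complete, then restore it to a new volume that
# can be attached in place of the corrupted one.
ec2.get_waiter("snapshot_completed").wait(SnapshotIds=[snapshot["SnapshotId"]])
restored = ec2.create_volume(
    SnapshotId=snapshot["SnapshotId"],
    AvailabilityZone="us-east-1a",              # placeholder availability zone
)
print("Restored volume:", restored["VolumeId"])
```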
4. No data performance strategy
Shared, multi-tenant infrastructure can deliver unpredictable performance, and many cloud storage services offer little ability to tune performance parameters. Spikes in concurrent requests, network congestion, or device failures can introduce latency and slow everything down. Look for a layer of performance control over your data so that all of your applications and users get the responsiveness they need, and make sure it can scale with your demand and budget over time.
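It helps to have your own numbers rather than relying on a provider's headline figures. The sketch below, with a placeholder endpoint URL, times repeated reads of a small object and reports median and 95th-percentile latency as a baseline you can compare before and after a migration:

```python
# Minimal sketch: measure read latency against a storage endpoint.
import statistics
import time
import urllib.request

ENDPOINT = "https://storage.example.com/healthcheck-object"  # placeholder URL
samples_ms = []

for _ in range(50):
    start = time.perf_counter()
    with urllib.request.urlopen(ENDPOINT, timeout=5) as resp:
        resp.read()
    samples_ms.append((time.perf_counter() - start) * 1000)

p95 = statistics.quantiles(samples_ms, n=20)[-1]  # 95th percentile
print(f"median {statistics.median(samples_ms):.1f} ms, p95 {p95:.1f} ms")
```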
5. No data availability strategy
Hardware failures, human error... for any number of reasons, some downtime is inevitable. The best approach is to plan for the worst case, keep copies of your most important data, and be able to switch over quickly when an occasional failure occurs. Look for a cloud or storage provider willing to back your business with SLA guarantees. If necessary, build in a fail-safe option and use a secondary storage controller so your applications do not suffer any interruption.
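At the application level, even a small client-side failover wrapper can paper over the loss of one endpoint. The sketch below assumes two placeholder replica URLs that serve the same object:

```python
# Minimal sketch: read an object from the primary endpoint and fall back to a
# secondary replica when the primary is slow or unavailable.
import urllib.error
import urllib.request

PRIMARY = "https://primary.storage.example.com/data/report.json"    # placeholder
SECONDARY = "https://secondary.storage.example.com/data/report.json"  # placeholder

def fetch_with_failover(primary: str, secondary: str, timeout: float = 3.0) -> bytes:
    """Return the object from the primary endpoint, or the secondary on failure."""
    for url in (primary, secondary):
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return resp.read()
        except (urllib.error.URLError, TimeoutError):
            continue  # endpoint unreachable or timed out; try the next one
    raise RuntimeError("both endpoints are unavailable")

data = fetch_with_failover(PRIMARY, SECONDARY)
```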
6. No multi-cloud interoperability strategy
According to Gartner analysts, as many as 90 percent of companies will have adopted hybrid infrastructure by 2020. Growing pressure to optimize efficiency and control costs is driving this trend, but you must assess your choices and their impact on your business correctly. Make sure you can switch cloud vendors easily in the future, and understand in advance how much code would have to be rewritten to do so. Vendors will often try to lock you in with proprietary APIs and services, but you need to preserve multi-cloud interoperability for your data and applications in order to stay agile and keep your options open.
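One common way to preserve that interoperability is to put a thin storage interface between application code and any vendor SDK. The sketch below is illustrative (class and method names are not from any real library); only the S3Store class knows about boto3, so a different backend can be swapped in without touching business logic:

```python
# Minimal sketch: hide the storage backend behind a small interface so that
# business logic never imports a vendor SDK directly.
from typing import Protocol

class ObjectStore(Protocol):
    def put(self, key: str, data: bytes) -> None: ...
    def get(self, key: str) -> bytes: ...

class LocalStore:
    """Filesystem-backed implementation, useful for tests or on-premises use."""
    def __init__(self, root: str) -> None:
        import os
        self.root = root
        os.makedirs(root, exist_ok=True)

    def put(self, key: str, data: bytes) -> None:
        with open(f"{self.root}/{key}", "wb") as f:
            f.write(data)

    def get(self, key: str) -> bytes:
        with open(f"{self.root}/{key}", "rb") as f:
            return f.read()

class S3Store:
    """Thin wrapper over boto3; only this class knows anything about AWS."""
    def __init__(self, bucket: str) -> None:
        import boto3
        self.bucket = bucket
        self.client = boto3.client("s3")

    def put(self, key: str, data: bytes) -> None:
        self.client.put_object(Bucket=self.bucket, Key=key, Body=data)

    def get(self, key: str) -> bytes:
        return self.client.get_object(Bucket=self.bucket, Key=key)["Body"].read()

def archive_report(store: ObjectStore, report: bytes) -> None:
    # Application code depends only on the interface, so the backend can be
    # swapped (S3, Azure Blob, local disk, ...) without rewriting this function.
    store.put("latest-report.bin", report)
```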
7. No disaster recovery strategy
A simple mistake, even a single faulty line of code from a developer, can be enough to corrupt your data. Even your cloud provider can be breached, resulting in the loss of both data and backups. In the event of a disaster or an attack, redundancy and offline backups are the key to fully rebuilding your IT infrastructure.
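A simple pattern is to copy each backup to offline (or at least separately controlled) media and verify it with a checksum, so a compromised primary copy cannot silently become your only copy. The paths below are placeholders:

```python
# Minimal sketch: copy the latest backup to offline media and verify its
# integrity with a checksum.
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

source = Path("/backups/db-latest.dump")             # cloud-synced backup (placeholder)
offline = Path("/mnt/offline-vault/db-latest.dump")  # air-gapped media (placeholder)

shutil.copy2(source, offline)

if sha256(source) != sha256(offline):
    raise RuntimeError("offline copy failed its integrity check")
print("offline copy verified:", offline)
```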
Finally, the desire to save money and cut costs is understandable, but in the long run that one-sided thinking can cost you far more. Taking the time to develop the right data management strategy greatly reduces your risk.