Edition 15: Dropbox moving away from AWS, Data breaches & tokenization, Data Analytics free resources, and much more
Hello Friends,
Hope you are well.
This week we are covering 5 topics.
What can we learn from Dropbox moving away from AWS: a case study
Data Analytics: 2 awesome free resources
Data breaches & a case for tokenization
8 pitfalls to avoid while doing security assessments
4 new target areas of growing cyber attacks
Let's go.
What can we learn from Dropbox’s move away from the public cloud?
Dropbox had 500 million+ users, 200k+ business customers, and 500 petabytes of data on AWS. Then they decided to move their infrastructure out of AWS. Why?
Dropbox grew fast. Their business and the quality of their user experience were dependent on someone else. Three reasons led to their move: business experience, control, and cost. Once they moved, they gained complete control of the user experience and performance requirements. They got complete insight into their network operations, so much so that it became a sales tool for them. And finally, they achieved their OPEX goals and reduced costs across a host of tracked metrics.
So, is repatriation for everyone? No. The decision depends on a host of factors - business margins, tech maturity, volume and scale, and a clear business case.
Data Analytics- more free resources
I published a list of free resources on advanced analytics in past editions of the newsletter. Here are a few more good ones.
Excellent website for data engineering: I learned quite a bit from it. The website has tons of free tools and resources. It offers books, courses, and recommendations across a variety of topics, from SQL to databases to pipelines to coding in Python. This one has it all.
$0 course in storytelling with data: I am a visual person. Storytelling with data has always caught my attention. I recommend everyone take advantage of this course and make it part of their learning. If you choose to, you can also pay to receive a certificate once you finish the course; otherwise, it is free.
Happy learning.
Making a case for tokenization to protect from data breaches
With increased data breaches, is tokenization a solution everyone should consider? There are some clear purposes for tokenization.
Purpose #1: It reduces the compliance burden. You store tokens instead of sensitive data, so the compliance requirements you must meet are much lighter.
Purpose #2: It makes data accessible on a "need to know" basis. Only a select few get de-tokenization access.
Purpose #3: Sharing of data becomes a breeze. It applies when sharing data within the organization for analytics. It also applies when sharing data outside the organization.
But, tokenization can become costly and complex. So, your business and data environment govern the tokenization decision. There are many factors at play - Cost, Solution options, structured vs unstructured data, performance requirements, deployment architecture, and integration with your data lake or data pipeline.
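To make the idea concrete, here is a minimal sketch of a token vault in Python. It is an illustration only, assuming an in-memory dict as the "vault" and a hypothetical `TokenVault` class; real deployments use a hardened, access-controlled vault service or a vaultless format-preserving scheme.

```python
import secrets

class TokenVault:
    """Toy token vault: swaps sensitive values for random tokens."""

    def __init__(self):
        self._vault = {}  # token -> original value (illustration only)

    def tokenize(self, value: str) -> str:
        # A random token has no mathematical relationship to the value,
        # unlike ciphertext - a stolen token reveals nothing on its own.
        token = "tok_" + secrets.token_hex(8)
        self._vault[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # In practice this call sits behind strict "need to know"
        # access control - only a select few can reach it.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
print(token)                    # downstream systems store only this token
print(vault.detokenize(token))  # prints 4111-1111-1111-1111
```

Downstream analytics and sharing work on the tokens alone, which is why purposes #2 and #3 above fall out naturally from this design.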
Topics worth recapping from the week
8 pitfalls to avoid with Security assessments
Cyber Security assessments are your friend to uncover blind spots.
If done well they can bring in significant value.
Here are 8 pitfalls to avoid so that security assessments work for you.
Taking on too much scope. Complex organizations should take a piecemeal approach. How do you know the scope is too much? Spending months in an assessment is a red flag.
Starting every assessment from scratch. Every vendor comes in with its own method. Starting from scratch means a lot of wasted time for your key resources. Use your past assessments to provide background and key information. Let those reports act as a benchmark.
Assigning aggressive dates to remediations and initiatives always backfires. Teams overestimate what they can do in the short term and underestimate what they can do over a long period. Add a buffer for retests, delayed deployments, or other priorities. Life happens.
Choosing the wrong assessment models. Using a framework such as NIST CSF, C2M2, or RC3 without defining the best fit can come back to bite you later and cause rework.
Failing to take recommended actions or remediating steps. It is common to see that follow-through actions are not taken after assessments finish.
Not having relevant stakeholders on board with the purpose, approach, and outcomes of assessments. Core stakeholders should be on board and committed to successful remediations.
Risk getting lost in detailed reports. The risk and core message can get lost in detailed reports. Experts who can write and present well hold the key to delivering the message.
Not associating dollars with risks and hence not getting the right attention. Leaders know there are risks but cannot comprehend their severity. Why? There is no quantification. This creates a perception and communication gap between executive leadership and management.
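One common way to put dollars on a risk is Annualized Loss Expectancy: ALE = SLE (single loss expectancy) x ARO (annual rate of occurrence). The sketch below uses made-up illustrative figures, not benchmarks.

```python
def annualized_loss_expectancy(asset_value: float,
                               exposure_factor: float,
                               annual_rate: float) -> float:
    """ALE = SLE x ARO, where SLE = asset value x exposure factor."""
    sle = asset_value * exposure_factor  # expected loss per incident
    return sle * annual_rate             # expected loss per year

# Hypothetical example: a $2M customer database, 30% of its value
# exposed per breach, a breach expected once every 4 years (ARO = 0.25).
ale = annualized_loss_expectancy(2_000_000, 0.30, 0.25)
print(f"${ale:,.0f} per year")
```

A single dollar figure like this gives executives a severity they can rank against other business risks, closing the communication gap described above.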
4 New target areas of growing cyber attacks
We are working remotely more than ever. Organizations rely on technology more than ever to link users, applications and data. Malicious actors continue finding new ways to gain unauthorized access to the network.
Here are 4 new target areas of growing cyber attacks:
Linux systems. Long considered safe, Linux is becoming a new target. It runs the backend for networks and container-based solutions, and there is now malware that targets Linux systems with remote access.
Satellite-based connectivity to connect remote OT devices. 8 million Americans use satellite communications for internet access, and a wide variety of industries use satellite networks.
Crypto wallets - bitcoin addresses, private keys, etc. These wallets are becoming valuable and hence a prime target for cyber attacks.
Critical infrastructure (pipelines, water supplies). These impact a lot of people and are prime examples of IT and OT convergence. We saw this with the Colonial Pipeline cyber attack. We have also seen attacks on water facilities, and many global examples exist now and are ramping up.