Small Services, Big Data: How Microservices Handle Big Data

Big data plays a prominent role in today’s marketplace and companies effectively leveraging that data are experiencing big benefits.

In fact, the 2017 market volume for big data was $35 billion, and it is estimated to grow to well over $100 billion by 2027. Need more convincing? Data-driven organizations are 23 times more likely to acquire new customers.

Sounds pretty sweet right?

As of 2017, 53% of companies sure thought so, getting in on their piece of the deliciously lucrative big data pie.

Join those companies and start leveraging your data by partnering with KMS. We are analytics and microservices experts.  




But, as sweet as it seems, without the right architecture in place (a crust, if you will), you could end up with a pretty soggy pie.

That said, we would like to tell you about microservices and why the application architecture works so well for big data.

Not familiar with microservices? No need to worry–we have already written all about it in our post Going From Monolithic to Microservices.


But to give you a speed summary, microservices is a DevOps-based architecture composed of a collection of small, independently versioned, deployable, and scalable services, each designed to implement a single business capability.

Switching to a microservices architecture can help you evenly distribute responsibilities across your system, increasing the scalability, resilience, and agility of both your products and your teams.


Benefits of Microservices for Big Data

So why are these small services an effective foundation for big data?

Let’s face it, big does not even begin to describe the amount of data being produced and this exponential growth shows no signs of stopping. By 2020, 1.7 megabytes of new information will be generated for every person, per second. You read that correctly, per second.

As your company attempts to process and analyze your slice of that data, you do not want your application to become a processing bottleneck and keep you away from those coveted data-driven insights.

As such, it is important that your data processing application is easily scalable.

One way to ensure scalability is to build your application using a microservices architecture. Built for the cloud and on virtualization, microservices are by design inherently more scalable than a traditional monolithic application. Given the elasticity of the cloud, you can store and work with large volumes of data relatively easily. As big data continues to evolve, cloud providers have the agility to respond and support these evolutions.

This scalability is what makes microservices an ideal architectural solution when working with massive datasets.

Microservices also help you control the quality of your data, something that can prove difficult with monolithic data solutions.

The way microservices are designed, each service focuses on a specific aspect of the data. This means you can target specific points in the dataset and easily resolve an issue if one arises, ultimately giving you more control over data quality than you would have with a traditional monolithic application.

This organizational structure also works well with different data storage technologies (NoSQL, relational databases, etc.). The independent services allow you to apply a different technology to each, depending on which works best for that specific part of your data.


Overcoming the Challenges

While a microservices architecture is a great solution for big data, its complexities should not be taken lightly.

The movement to microservices is both a technical and organizational shift for any company. When working with big data, this is no different. Ultimately it takes a well prepared, DevOps oriented team to most successfully leverage microservices for big data.

Leveraging the right technologies can help, and fortunately these technologies integrate very well with microservices. For example, messaging systems such as Kafka integrate easily, and container orchestration solutions like Kubernetes, or managed container services from cloud providers such as AWS, are a natural fit for big data and microservices.

The blog post we mentioned earlier tells you everything you need to know about making the switch and strategies you can use to mitigate potential challenges.


Despite the hurdle of preparing your team and organization, ultimately microservices is a worthwhile investment. The architecture is becoming increasingly popular in DevOps environments and offers the scalability and control you need to handle big data.

Whether your team is ready to take on microservices and dive into big data, or you simply want a more effective way to manage the data you are already accessing, we hope we have convinced you that microservices is a strong architectural candidate.




Going From Monolithic to Microservices: How to get started

Are you exploring the benefits of modernizing your legacy application? Do you want to increase revenue, reduce maintenance costs, and expand product capabilities?

A microservices architecture meets the modern demand for faster, more frequent deployment cycles. It allows DevOps teams to deploy changes only to the services impacted, rather than the entire platform as in the monolithic world.

It also helps promote higher productivity and accountability, with smaller teams owning smaller services; thus, developers can quickly make changes without worrying about how they may impact other parts of the system. This is a key benefit of microservices architecture. The best part? These benefits can give you the extra step you need to get ahead of key competitors.

Caught your attention with that, didn’t we?

Once a decision has been made to modernize, what is the next step to make it a reality? What is the right approach? How long will it take? Is my organization prepared to make this journey?

Most consultants will advise that microservices is the better way to go if you are looking to modernize a legacy, monolithic application. The concept has risen sharply in popularity in the past few years, as seen in the many articles, blogs, and videos published on the subject.

According to the Global Microservices Trend, a survey conducted by Dimensional Research, 91% of respondents say they “are using, or have plans to use microservices.” What’s more, 86% expect “microservices to be the default within five years.”

We cannot debate the direction microservices are going; however, the devil is in the details of the approach. As you begin this journey, careful consideration should be paid to the following areas to position your organization for success in the new world of microservices.

Need a partner that will help you navigate moving a legacy platform to microservices?


Microservices is more than a technical change

Establishing microservices is not solely a tech-led adventure. If you want to take full advantage of the speed and agility microservices can offer, you are on the right path. But you need to make sure your organization is set up to run that fast. Be sure your team is able to handle the increased complexity and the different infrastructure, architecture, speed of delivery, and ways of doing things.

We recommend organizing small teams around each microservice, with full ownership and accountability for the build, testing, and support tasks. The independence and autonomy given to each team will provide them with the necessary environment to operate with improved efficiency and focus.


Study the paths others have taken

Many people dream of reaching the summit of the world’s highest mountains, but survival depends on an experienced guide to lead the way. This is very similar to the microservices journey. Seek the wisdom of those who have successfully completed this journey.

Consider one (or all) of the following as your guide:

  • All the major cloud providers (AWS, Azure, GCP) publish detailed microservices design patterns you can leverage
  • Netflix developed and open-sourced the tools, services, and libraries it built and leveraged on its journey
  • Google has shared some very compelling lessons learned from its microservices efforts, and they are worth paying close attention to


Build a roadmap for your journey

Typically, modernization efforts require careful planning to ensure the availability and stability of the legacy system as the new microservices are implemented. We recommend a multi-phased approach to achieve the best results…

Phase 1
Assess your current system to determine the types of services to be developed and the sequence in which they will be built and deployed. Factors that could influence the starting point include:

  • Business and technical areas of concern within the legacy system that would benefit the most from the new design.
    • Example: Areas that are most susceptible to problems from a scaling or performance standpoint
  • Experience level of your teams working on the existing product
  • Targeted technology stack for the new services
  • Current data model and the effort or complexity involved to isolate and align a new data structure with the new service
  • Security/privacy requirements of NPI or SPI data within the legacy system and organizational readiness to manage that data in a cloud environment

Phase 2
Have your team build out the development and testing infrastructure, including an API gateway, and begin building out Proof of Concept code to gain experience and confidence in the new architecture.  

The API gateway is a key factor for success during the migration from monolithic to microservices, as it becomes the interface between your existing application and the new services. Everything behind the API gateway can be changed independently and not impact your monolithic application over time. This strategy will enable you to move incrementally to the new architecture but minimize the impact and changes on the existing system.
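To make the gateway's role concrete, here is a minimal sketch of path-based routing during a migration. The path prefixes and backend URLs are hypothetical examples, not a prescribed configuration; a real gateway (NGINX, Kong, AWS API Gateway, etc.) applies the same idea declaratively.

```python
# Illustrative routing table: paths already migrated to microservices
# are sent to the new service; everything else falls through to the
# legacy monolith. All names here are hypothetical.
ROUTES = [
    ("/api/orders", "http://orders-service:8080"),  # migrated microservice
]
MONOLITH = "http://legacy-monolith:9090"            # default backend

def route(path):
    """Return the backend that should handle this request path."""
    for prefix, backend in ROUTES:
        if path.startswith(prefix):
            return backend
    return MONOLITH

print(route("/api/orders/123"))  # http://orders-service:8080
print(route("/api/users/1"))     # http://legacy-monolith:9090
```

As each new service goes live, you add one routing rule; clients of the gateway never notice that the backend behind a path has changed.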

Phase 3 and beyond…
Build, test, and deploy the first microservice. Begin by introducing the service in Beta mode to your existing client base to gather feedback and lessons learned in the use, support, and performance of the new service. In true “inspect and adapt” form, incorporate lessons into subsequent service builds and incrementally grow your service catalog over time. Keep growing services until you have reached the point where you have completely migrated the legacy system to the new architecture or the desired business outcome for the modernization effort has been achieved.


Prepare your organization

To be successful with microservices, a solid DevOps understanding is required. You must have a strong team to support builds and automated tests, while also monitoring your environment differently than you would a traditional monolithic application infrastructure.

Collaboration tools can help with the endeavor to create a strong team. Keep track of tasks and chats with tools such as Trello, Slack, Asana, Google Docs, and many more. Don’t let lack of preparation or organization keep you from effectively working together! Your microservices migration can be an adventure–not a nightmare–but only if you take the necessary steps to ensure success.


Find a trusted partner

To help you make an informed and objective decision, hiring a consultant might be a good option. The consultant can take over the enormous task of evaluating your application. After discussing the pros and cons of microservices, that consultant can also take on the challenges of migration.

KMS Technology, for example, is a professional consultant company with 1000+ offshore resources in Vietnam. With agile teams in place ready to tackle even the most difficult projects, we can bring you to market faster with outstanding quality. Migrating to microservices is a long-term investment. We’re here to make that a smooth transition!


(Check out our services for more information.)

Advertising our services aside, we truly do believe that microservices architecture is the way of the future. But is it right for you? Remember to consider both the pros and cons of this development approach. If you take into account all the information we have collected over (many!) years of experience and have now passed on to you, you will be well positioned for a successful microservices modernization effort.

After all, trends indicate microservices will become the standard very soon. Make sure you have developed a product that can stand against the fierce marketplace competition. As technology gets better, customer expectations rise. It’s necessary to keep up with those expectations or fall by the wayside.

Top 5 Application Vulnerabilities: How to Prevent Risks

An application in today’s environment can be affected by a wide range of issues, resulting in serious damage to an individual application or the overall organization. To build a secure and stable application, you must first recognize different attacks that can make the application vulnerable.

At KMS, we make sure security is a top priority in application development. Contact us to work on your next project.


The order of the following list is based on the risk factor of each application vulnerability and is intended to help you prevent attacks from taking place.


1. Injection

Injection has been the highest-risk vulnerability for decades. Almost anything can be injected: SQL queries, LDAP, XPath, NoSQL queries, expression languages, etc. Injection can occur whenever a user is allowed to input untrusted data via forms, URLs, or anything else sent back to the application/system. Any new expression or query-like language can be exploited if not implemented correctly.

How to prevent:

  • Any untrusted source of data must be validated on the server side.
  • Limit the input data by implementing a whitelist instead of a blacklist — you never know what might be out there that can harm your system.
  • If possible, use safe APIs and parameterized queries. Most frameworks support this.
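To illustrate the parameterized-query point, here is a minimal sketch using Python's built-in sqlite3 module (the table and user names are made up for the example). The `?` placeholder guarantees the input is treated strictly as data, never as SQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # a classic injection attempt

# Unsafe: string concatenation would let the input rewrite the query:
#   "SELECT role FROM users WHERE name = '" + user_input + "'"

# Safe: the ? placeholder binds the input as a literal value.
rows = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()
print(rows)  # [] -- the malicious string matches no user
```

The same placeholder mechanism exists in virtually every database driver and ORM; the key is never to build queries by concatenating untrusted strings.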



2. Broken Authentication

Broken authentication vulnerabilities occur when session management is not handled properly on the server side. Attackers can reuse a session ID or token to gain access. It can also occur if weak authentication or weak recovery methods are used.

How to prevent:

  • Most advanced frameworks have reliable and secure session management. If you must implement your own, ensure the session ID is randomized and the session is invalidated correctly. Never expose your session ID via the URL or any visible location on the screen.
  • Implement a weak-password check.
  • It seems like a good idea to force users to change their password once in a while. In reality, however, this does not solve the problem, as users tend to rotate their own passwords for easy memorization. A better way to ensure password security is to implement multi-factor authentication.
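As a sketch of the first two points, the snippet below generates a cryptographically random session ID with Python's standard secrets module and runs a weak-password check against a tiny illustrative deny-list (a real one would use a large breached-password corpus).

```python
import secrets

def new_session_id():
    # 32 random bytes (~256 bits) from the OS CSPRNG, URL-safe encoded:
    # unguessable, unlike timestamps or incrementing counters.
    return secrets.token_urlsafe(32)

# Tiny illustrative deny-list; production systems check millions of
# known-breached passwords.
COMMON_PASSWORDS = {"password", "123456", "qwerty"}

def is_weak(password):
    return len(password) < 12 or password.lower() in COMMON_PASSWORDS

print(new_session_id())                        # e.g. 'kQ3x…' (random each call)
print(is_weak("123456"))                       # True
print(is_weak("correct-horse-battery-staple")) # False
```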



3. Sensitive Data Exposure

Sensitive data can be exposed to attackers in many ways. The most common exposures happen when data is transferred in plain text or protected only by weak encoding. The following is an example of credentials sent over an unsecured network using Base64 encoding:
 GET /api/secured HTTP/1.1
 Authorization: Basic YXVkaXQtd3M6TGV4aXNAMjAxOA==
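Base64 is an encoding, not encryption, so anyone who intercepts that header can recover the credentials in one line:

```python
import base64

# The Authorization token from the request above decodes trivially:
token = "YXVkaXQtd3M6TGV4aXNAMjAxOA=="
print(base64.b64decode(token).decode())  # audit-ws:Lexis@2018
```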

How to prevent:

  • Ensure data is transferred via a secure protocol (e.g. HTTPS over HTTP, SSH over Telnet, SFTP over FTP). HTTPS prevents exposing plain-text data in transit over the internet, and certificates are very affordable nowadays. For internal data transfers, always use a secured layer as well, as people tend to keep data in plain text for faster transmission.
  • Store passwords using strong, adaptive, salted hashing functions.
  • Keep your encryption algorithms up to date and choose your key size wisely, since there is a tradeoff between key size and performance.
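The salted, adaptive hashing point can be sketched with Python's standard library alone. This is one reasonable approach (PBKDF2 with a per-password random salt), not the only acceptable one; bcrypt, scrypt, and Argon2 are also widely recommended.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Return (salt, digest) for storage. A fresh random salt per
    password defeats precomputed rainbow-table attacks."""
    salt = salt or os.urandom(16)
    # 600,000 iterations makes brute-forcing each guess expensive.
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify(password, salt, digest):
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(hash_password(password, salt)[1], digest)

salt, digest = hash_password("s3cret-example")
print(verify("s3cret-example", salt, digest))  # True
print(verify("wrong-password", salt, digest))  # False
```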



4. XML External Entities

Attackers can embed external entity (XXE) references in an XML document and send it to the server. This can be confused with the injection vulnerability, but it is actually a privilege-abuse method that exploits many older XML processors. Any XML processor or SOAP-based web service with DTDs enabled can be a target. Developers are generally aware of injection vulnerabilities, but XML External Entities are less well known, which significantly increases the risk of this type of attack.

How to prevent:

  • If possible, try to use a less complex data format such as JSON.
  • Patch/update all XML processor libraries and use SOAP 1.2 or higher.
  • Disable XML External Entity and DTD processing in your XML parser.
  • If you must enable that support, prevent a worst-case scenario by implementing a whitelist and custom validation of the input.
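As one concrete example of disabling external entities, Python's standard xml.sax parser exposes feature flags for exactly this; the two `setFeature` calls below turn off fetching of external general and parameter entities, the core XXE vector. (Other stacks have equivalent switches, e.g. `disallow-doctype-decl` in Java's SAX parsers.)

```python
from xml.sax import make_parser
from xml.sax.handler import feature_external_ges, feature_external_pes

parser = make_parser()
# Refuse to resolve external general entities (file://, http:// payloads).
parser.setFeature(feature_external_ges, False)
# Refuse to resolve external parameter entities (used in blind XXE).
parser.setFeature(feature_external_pes, False)
```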



5. Broken Access Control

Attackers can have access to unauthorized resources via different means. Most likely, there is a flaw in the implementation to prevent users from accessing unauthorized resources. For example:
 POST /api/person/1/change-password
 {"password": "new_password"}
By changing "1" to a different user ID, the attacker can change the password for any user, not just their own.

How to prevent:

  • Deny default access to all non-public resources.
  • Always check for user privilege at server side.
  • Disable CORS or minimize its usage.
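A sketch of the server-side privilege check for the change-password example above (the in-memory user store and function names are illustrative, not a specific framework's API):

```python
# Hypothetical in-memory user store standing in for a database.
USERS = {1: {"password": "old"}, 2: {"password": "hunter2"}}

def change_password(authenticated_user_id, target_user_id, new_password):
    # Never trust the ID in the URL: compare it against the identity
    # established by the server-side session before acting.
    if authenticated_user_id != target_user_id:
        raise PermissionError("cannot change another user's password")
    USERS[target_user_id]["password"] = new_password

change_password(1, 1, "new_password")  # allowed: acting on own account
# change_password(1, 2, "pwned")       # would raise PermissionError
```

The essential point is that authorization is decided from the server's session state, never from IDs the client supplies.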


Other commonly seen and exploited vulnerabilities are XSS and CSRF. They are not included in the Top 5 list, however, because most advanced frameworks nowadays take care of these vulnerabilities through their default configurations.

Following secure coding best practices will gain huge value in return, especially in the early stages of development. You never want to run the risk of losing your valuable and sensitive data due to flaws in implementation. Some general rules of thumb you should consider are:

  • Subscribe to forums that share the latest vulnerabilities along with their CVE identifiers.
  • Always keep your libraries up to date with the latest patches.
  • Define a security process that makes it fast and easy to develop, test, and deploy applications. Automated processes that leverage tools such as Burp Suite are very beneficial.