Big data development comes with many risks. Risks related to schedule, quality, and cost make it difficult for software architects to design such systems, so architects should take care of a few things before they actually start building a big data prototype.
Architecting a prototype of a big data system is certainly a challenging task, as the technology landscape changes constantly. On top of that, there are quality attribute issues and other performance challenges. Prototypes are therefore built to manage the risks associated with big data development.
Collection of Big Data
In building a big data prototype, the collection, storage, and processing of the collected data is certainly the most important part. For the demo to be built properly, the user needs to make sure the information is processed correctly. To do that, a few steps need to be followed.
The first step is to obtain the data source
Next, load the collected data into the desired database
Then connect the database to the processing system
Finally, connect the front-end visualization to the processed, cached data
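As an illustrative sketch, the four steps above might look like the following in Python, with SQLite standing in for the database and a plain dictionary standing in for the front-end cache. All names and sample data here are hypothetical, not from any particular system:

```python
import sqlite3

# Step 1: obtain the data source (here, a hypothetical in-memory sample).
source_rows = [("sensor-1", 21.5), ("sensor-2", 19.8), ("sensor-1", 22.1)]

# Step 2: load the collected data into the desired database (SQLite stand-in).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (sensor TEXT, value REAL)")
db.executemany("INSERT INTO readings VALUES (?, ?)", source_rows)

# Step 3: connect the database to the processing system (a simple aggregation).
cursor = db.execute(
    "SELECT sensor, AVG(value) FROM readings GROUP BY sensor ORDER BY sensor"
)
processed = {sensor: avg for sensor, avg in cursor.fetchall()}

# Step 4: the front-end visualization reads from the processed, cached data.
cache = {"avg_by_sensor": processed}
print(cache["avg_by_sensor"])
```

In a real prototype each stand-in would be swapped for the candidate technology under evaluation; the point of the sketch is only that all four stages are wired end to end.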
This phase highlights the fact that architectural analysis alone does not always give a clear picture. While an architectural analysis can provide a detailed picture, some parts are still left out. That is why prototyping is essential: it allows software architects to evaluate a new big data component at runtime and assess its performance.
There are three different approaches to prototyping. It is important to consider these approaches when an architect is building a prototype for a big data system.
Throwaway Prototyping: This method is also known as rapid prototyping. In this process, a prototype is developed rapidly and provided to end users in the field to test its performance. The prototype will not necessarily become part of the finally delivered software; hence the name throwaway prototype.
Vertical Evolutionary Prototyping: In this process, one or more subsystems are developed to full functionality prior to release. The big data systems developed this way are the ones that are ultimately released to actual users.
MVP or Minimum Viable Product: This is a product or prototype with only the features needed to allow deployment, which minimizes the time spent iterating on prototypes. The main emphasis of creating an MVP is to test the product hypothesis by providing it to real users and collecting data from their experiences. This data then helps confirm or reject the theories and hypotheses that developers and architects formed while building the prototype.
These were some of the things that need to be considered while building a particular prototype for big data systems.
Why Are Big Data Solutions Different?
Architecting big data systems is a complex task that challenges most data scientists because the technology landscape changes rapidly. There are other challenges as well: quality attribute challenges, particularly for performance, are significant. The number of technologies and technology families is large, and most programmers have minimal experience with them. Yet some software architects manage these risks with their architecture analysis skills, and some apply prototyping methods. Prototyping is a necessity with big data.
What Is the RASP Model?
The Risk-Based Architecture-Centric Strategic Prototyping (RASP) model is a set of standardized questions, decision procedures, and architecture analysis methods used to enhance prototyping activities. By answering these questions, an architect can determine whether to prototype and how to interpret the results of prototyping.
There are many risks involved in big data system development. Software architects working on any system, big data or otherwise, have to address risks related to cost, schedule, and quality. All of these risks are amplified in the big data context. The RASP model was designed and developed to bring cost-effective, systematic risk management to agile development of big data systems.
Here are six major risks involved in big data system development:
The 5 V’s of big data
Paradigm shifts – In small data system development, the primary effort went into coding; in big data systems it has shifted to technology selection and “orchestration”
The rapid proliferation of big data technologies and the complexity of selecting among them
Rapid technology changes
The short history of big data system development
Complex integration of new and traditional systems
Solutions to Risks
There are three streams of activity that architects run in parallel to manage risk:
Architectural Design – In this phase, software architects use a variant of the ADD (Attribute-Driven Design) method, known as BDD (Big Data Design).
Architecture analysis – A stream of architecture analysis activities runs in parallel with architectural design, analyzing business goals and the requirements derived from them for consistency and completeness. In this phase, architects can analyze their architectural decisions. This analysis can be done with the Architecture Tradeoff Analysis Method (ATAM); there are also lighter-weight techniques, such as tactics-based analysis or reflective questions.
Strategic prototyping – While architecture analysis provides detailed information on technology risks, it cannot provide complete guidance. Prototyping allows architects to verify the runtime performance of new big data components or to assess runtime resource consumption against thresholds.
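A performance check of the kind strategic prototyping calls for can start as a tiny timing harness. The sketch below, with a hypothetical component and a hypothetical latency budget, shows the basic shape of such a measurement:

```python
import time

# Hypothetical threshold taken from an architecture scenario.
LATENCY_BUDGET_S = 0.5

def candidate_component(records):
    # Stand-in for a real big data component under evaluation.
    return sorted(records)

def measure_latency(component, records):
    """Run the component once and return elapsed wall-clock seconds."""
    start = time.perf_counter()
    component(records)
    return time.perf_counter() - start

elapsed = measure_latency(candidate_component, list(range(100_000)))
print(f"elapsed={elapsed:.4f}s, within budget: {elapsed <= LATENCY_BUDGET_S}")
```

A real prototype would run the candidate against representative data volumes and repeat the measurement, but even this minimal harness turns a vague performance concern into a number that can be compared to a threshold.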
Prototyping decisions are significant in big data system development; therefore, architects need to make them in a disciplined, consistent manner. Since technology “orchestration” is the main ingredient of big data system development, this type of development requires prototyping support and architecture analysis for risk management. RASP can guide this analysis and prototyping.
How to Know Whether a Situation Demands Prototyping or Architecture Analysis
Some guidelines can help experts decide when to do architecture analysis, when prototyping is required, and under what conditions to deploy each type of prototype. Let’s discuss them in detail:
Architecture analysis is the best bet when you need to make early decisions that have far-reaching consequences. After this, experts can employ throwaway prototypes to choose technologies and assess feasibility.
Architecture analysis alone is insufficient for proving several critical system properties.
Architecture analysis complements vertical evolutionary prototyping. Analysis assists you in selecting candidate technologies; prototyping validates those choices and also puts a working system in front of stakeholders.
An evolutionary prototype mitigates risk most effectively when you implement it as an infrastructure within which components and technologies can be integrated. This helps mitigate integration risks, establishes an understanding of end-to-end scenarios, and supports early demonstrations of user value.
Vertical evolutionary prototyping can help answer questions about system-wide properties. However, it may need to be augmented with throwaway prototypes when requirements are volatile, since the architecture analysis and throwaway prototypes must change as requirements change.
Throwaway prototypes are best when you need a quick evaluation of a technology to determine whether it satisfies critical architecture scenarios. Experts usually use them for narrow scenarios that require evaluating just one or two components.
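The guidelines above can be read as a rough decision procedure. As a sketch only, with hypothetical condition names and no claim to be the RASP model’s actual procedure, they might be encoded like this:

```python
def choose_approach(early_decision, narrow_scenario, volatile_requirements):
    """Suggest approaches per the guidelines above (a rough heuristic sketch)."""
    approaches = []
    if early_decision:
        # Far-reaching early decisions call for architecture analysis first.
        approaches.append("architecture analysis")
    if narrow_scenario:
        # Quick checks of one or two components suit throwaway prototypes.
        approaches.append("throwaway prototype")
    else:
        # System-wide properties call for a vertical evolutionary prototype...
        approaches.append("vertical evolutionary prototype")
        if volatile_requirements:
            # ...augmented with throwaway prototypes when requirements churn.
            approaches.append("throwaway prototype")
    return approaches

print(choose_approach(early_decision=True, narrow_scenario=False,
                      volatile_requirements=True))
```

In practice these conditions overlap and an architect weighs them with judgment; the function only makes the structure of the guidelines explicit.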
Guidelines for Prototyping
It is important to keep in mind that prototypes are meant to be fast, easy tests of design solutions. Here are some guidelines to assist you in the prototyping stage:
Just begin building
When you create a prototype of your idea, you get to understand the idea in a concrete way. The prototype offers insight into your idea, and you can improve it if needed.
Don’t spend excess time
Prototyping is about speed; if you spend too much time creating a prototype, you may become emotionally attached to the idea.
Remember what you are testing for
Every prototype should have a central testing issue. Don’t lose sight of that issue, but don’t ignore the other important lessons that emerge while testing it.
Build while considering the user
Make sure your prototype tests are based on expected user behaviour and user needs. This way you can find the gaps and refine your ideas.