Ethereum – The Next Generation of Blockchain

Technology is truly, madly and absolutely unruly.

First, it whips up an appetite for an immortal idea and then, almost immediately, threatens its extinction. Something similar is happening with Ethereum – Vitalik Buterin’s answer to Blockchain.

Ethereum – An interesting past

It all started in 2009, when an unknown entity called Satoshi Nakamoto created a cryptocurrency and an electronic payment system called Bitcoin and released it as open-source software to enable peer-to-peer transactions.

Bitcoins were virtual value tokens, unaffiliated with any nation. So, anyone could open an online wallet and receive Bitcoins without providing any identifying information. The transactions got recorded on a universal ledger or database referred to as the Blockchain because “all the activities were sorted into ‘blocks,’ and each block was chained to the ones before it, all the way back to the very first transaction — a structure that made it tough for anyone to change the records.”


Truly a first for business on the internet, Bitcoin offered a transparent digital medium for exchanging value. Stocks, bonds, money, digital property, titles, deeds, even things like identities and votes could now be moved, stored and managed securely, with immutability and privacy.

Predictably, Blockchain became the buzzword of choice with reactions ranging from intelligent and exploratory to warnings and total incomprehension.

Predictably, Bitcoin’s bumpy ride from laud to lament was equally quick. And like all good things tech, it was soon challenged by the Canadian math prodigy, coder and cryptocurrency researcher Vitalik Buterin.

And Ethereum – a platform that used Blockchain technology to deliver quite a bit more – took center stage. And how!

Ethereum – A dynamic present

An open software platform, Ethereum was based on Blockchain technology, but it now enabled developers to build and deploy decentralized applications instead of creating a new Blockchain for each new application.

The Ethereum Virtual Machine (EVM) was Turing-complete software that ran on the Ethereum network and enabled anyone to run any program, regardless of the programming language, given enough time and memory.

The most exciting feature of Ethereum? It could create binding financial agreements implemented entirely in software, without human or legal intervention. Once started, these programs kept running without interference – hence the apt name “Smart Contracts”.

The Bitcoin equivalent here was Ether – the value token of the Ethereum Blockchain, listed as ETH and traded on the cryptocurrency exchanges. It was also used to pay for transaction fees and computational services on the Ethereum network, and was held in an Ethereum Wallet.


With Smart Contracts, one could now create a virtual decentralized autonomous organization. (And yes, it was created – The DAO raised an impressive $150 million from Ethereum enthusiasts!)

It also opened up multiple app possibilities – from censorship-free social networks and public-utility ride-hailing apps to crowd-sourced prediction markets and online investment firms.

But that was not all. Ethereum offered significant advantages too.

  • Bitcoin’s six confirmations and roughly 10-minute processing time were replaced with Ethereum’s 12–15 seconds.
  • Ethereum aimed to scale from about 25 transactions per second to far higher throughput with its planned proof-of-stake consensus algorithm.
  • Features like Hashed TimeLock Contracts made theft far harder.
  • Tampering was extremely difficult because the apps ran on a network built on consensus.
  • Data was secured using cryptography.



Ethereum- a bright future

Today Ether has a market value of $50.59, with a market capitalization of $4,469,204,016. JPMorgan, IBM, Microsoft and Intel have already initiated exploratory projects based on this technology.

Samsung and IBM recently launched a project to coordinate Internet-connected devices, like washing machines and light bulbs, over an Ethereum-based network.

And banks like Wells Fargo, Barclays and Credit Suisse have launched pilot programs using the secured-contract option.


Early days, one would say, for this virtual idea that’s about to usher in crypto-economic democracy and transparent governance. Not impossible, though, if one were to share Vitalik’s optimism: “Building this future is an enormous task, but one can’t have it any other way.”

ETHEREUM – The Indian Chapter

Ethereum – Access

The easiest way is to use its native Mist browser – a user-friendly interface and digital wallet that lets users trade and store Ether as well as write, manage, deploy and use smart contracts.


There is also the MetaMask browser extension, which turns Google Chrome into an Ethereum browser.


Business process management is now way easier!



Business Process Management provides a workflow framework that helps business analysts and middle-level management start creating business workflows that eventually get executed as a process.

Workflow platforms use many components and are generally open source. One such product is Activiti. Activiti is a lightweight workflow and Business Process Management (BPM) platform targeted at business people, developers and system admins. Its core is a super-fast and rock-solid BPMN 2 process engine for Java. It’s open source and distributed under the Apache license. Activiti runs in any Java application – on a server, on a cluster or in the cloud. One advantage of Activiti is that it lowers the risk of failure and the amount of human intervention compared to traditional approaches.

Activiti is an Apache-licensed business process management (BPM) engine. The core goal of such an engine is to take a process definition comprising human tasks and service calls and execute them in a certain order, while exposing various APIs to start, manage and query data about process instances for that definition. Activiti uses BPMN 2.0, which makes communication and understanding between the business team and developers much easier – an added advantage of the Activiti workflow.
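To make the idea of a process definition concrete, here is a minimal sketch of a BPMN 2.0 file of the kind an engine like Activiti deploys. The process id, task name and target namespace below are illustrative, not taken from any real project:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<definitions xmlns="http://www.omg.org/spec/BPMN/20100524/MODEL"
             xmlns:activiti="http://activiti.org/bpmn"
             targetNamespace="http://example.com/processes">
  <!-- One process: start event, a human approval task, end event -->
  <process id="vacationRequest" name="Vacation Request" isExecutable="true">
    <startEvent id="start"/>
    <sequenceFlow id="flow1" sourceRef="start" targetRef="approveTask"/>
    <!-- A user task offered to everyone in the 'management' group -->
    <userTask id="approveTask" name="Approve request"
              activiti:candidateGroups="management"/>
    <sequenceFlow id="flow2" sourceRef="approveTask" targetRef="end"/>
    <endEvent id="end"/>
  </process>
</definitions>
```

The engine deploys a file like this and then starts instances of it by the process id; each running instance waits at the user task until someone in the management group completes it.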

Activiti supports BPMN 2.0 (Business Process Model and Notation), and BPMN 2.0 processes in Activiti run natively in Java. Activiti is a multi-component system, with each component cut out for a particular role. The components include:

1. Activiti Explorer:

Activiti Explorer is a web application that uses the Activiti APIs and showcases the features of Activiti. Activiti contains a demo setup that gets this web app up and running in a matter of minutes. It usually runs in a Tomcat server: deploy the Activiti war into the webapps folder of the Tomcat installation. The demo setup consists of demo users and models and includes task management, process instance inspection, management features and reports based on statistical history data.

2. Activiti Designer:

The Activiti Designer is an Eclipse plugin that lets you create and model BPMN 2.0 workflows from within your IDE. It also has built-in support for the Activiti-specific extensions, enabling you to use the full potential of both the processes and the engine.

3. Activiti Modeler

The Activiti Modeler can be used to create and model BPMN 2.0-compliant processes graphically in a browser. The process files are stored by the server in a database model repository. The Activiti Explorer web app ships with the Activiti Modeler built in for creating workflows and models.

4. Activiti Engine

It is the heart of Activiti: a Java process engine that runs BPMN 2.0 processes natively. The Activiti Engine is packaged as a simple jar that is used for developing workflows with Activiti and exposes the engine’s functionality through the Activiti APIs.
In the next part of this series, we will see how Activiti is used in a business scenario.


Written by Sandeep.

Sandeep is a Research Associate at Qruize Technologies specializing in Java Development.

Reusable components speed up development time

With the number of mobile devices rising, there is increasing pressure on developers to churn out applications daily. In an already crowded mobile world, these applications get little time and attention from the average user. The rate at which apps are churned out has made the mobile marketplace a volatile playground where only the fittest survive.

To help developers stay ahead of the race and still create applications that deliver value, here are some components that can be reused in any project, accelerating development and time to market. These modules are helpful for kick-starting any Android-based project.


This module allows an app to log debug and error messages with details about the line number, the method called, the class name and the package it belongs to. It appends each log entry to a user-defined file with a timestamp.
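As a rough illustration of what such a module does (the class and method names here are invented for the sketch, not the actual component’s API), a plain-Java version might look like this:

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.time.LocalDateTime;

// Minimal sketch of a file logger that records the caller's class, method
// and line number, appending each entry to a file with a timestamp.
public class DebugLogger {
    private final String logFile;

    public DebugLogger(String logFile) {
        this.logFile = logFile;
    }

    // Build one entry: timestamp [LEVEL] package.Class.method:line - message
    public static String format(String level, String message, StackTraceElement caller) {
        return String.format("%s [%s] %s.%s:%d - %s",
                LocalDateTime.now(), level,
                caller.getClassName(), caller.getMethodName(),
                caller.getLineNumber(), message);
    }

    public void log(String level, String message) {
        // Index 2 is the method that called log(): 0 = getStackTrace, 1 = log
        StackTraceElement caller = Thread.currentThread().getStackTrace()[2];
        String entry = format(level, message, caller);
        try (PrintWriter out = new PrintWriter(new FileWriter(logFile, true))) {
            out.println(entry);
        } catch (IOException e) {
            System.err.println("Could not write log entry: " + e.getMessage());
        }
    }

    public static void main(String[] args) {
        DebugLogger logger = new DebugLogger("app.log");
        logger.log("DEBUG", "application started");
        logger.log("ERROR", "something went wrong");
    }
}
```

On Android, the same caller information is available via `Thread.currentThread().getStackTrace()`; only the file location and the logging sink would differ.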


This module makes different network requests – GET, POST, PUT, DELETE, etc. – with a URL, optional parameters and optional headers, and it returns two listeners: one for the response and one for errors.
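A bare-bones sketch of that idea in plain Java follows. The class name, method signature and listener shapes are assumptions for illustration; the real module targets Android and would run requests off the main thread:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Map;
import java.util.function.Consumer;

// Sketch of a reusable request helper: one entry point taking a URL, an
// HTTP method and optional headers, plus a response listener and an error
// listener. All names here are illustrative.
public class NetworkClient {

    public static void request(String url, String method, Map<String, String> headers,
                               Consumer<String> onResponse, Consumer<Exception> onError) {
        try {
            HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
            conn.setRequestMethod(method);
            if (headers != null) {
                headers.forEach(conn::setRequestProperty);
            }
            // Read the whole response body into a string
            StringBuilder body = new StringBuilder();
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    body.append(line);
                }
            }
            onResponse.accept(body.toString());
        } catch (Exception e) {
            // Anything that goes wrong is routed to the error listener
            onError.accept(e);
        }
    }

    public static void main(String[] args) {
        // No network needed for this demo: a malformed URL goes to the
        // error listener instead of being thrown at the caller.
        request("not-a-url", "GET", null,
                body -> System.out.println("response: " + body),
                err -> System.out.println("error: " + err.getMessage()));
    }
}
```

The two-listener shape keeps calling code free of try/catch blocks, which is the main convenience such a module provides.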


This module allows an app to display a splash screen or image for a few seconds before loading the actual app.


This module creates different views dynamically at runtime – for example, an ImageView or a VideoView.

Here are some screenshots which detail the component in use:

Login Screen


Splash Screen


Interactive Menu


Top 10 IT predictions for 2016


As we come to the close of yet another year – one that saw rapid strides in technology across every sphere of life – here are ten exciting trends that will rule 2016.

  1. The Device Mesh

This refers to the growing set of endpoints through which users access applications, interact with others, exchange data and store information. The device mesh includes everything from mobile phones to IoT devices. These devices are connected to their back-end networks and often work in isolation. This will change in due course, giving users more freedom and convergence.

  2. Ambient user experience

There’s no substitute for exemplary customer experience, whatever the channel. With deeper penetration of the device mesh, developers gain fine-grained control over how they want their customers to revel in the experience. Developers can forever alter the way customers think, feel and perceive brands in exciting new ways. With IoT heating up the personalization space, developers can now marry electronics, devices and data to form a formidable and consistent platform.

  3. 3D printing materials

3D printing is taking off in a big way. This means lower production costs and unlimited customization. Its scope extends from aerospace and medicine to the military and the energy sector. As devices become smaller while providing more advanced functionality, composite parts that can be easily manufactured, assembled and integrated are the order of the day – and 3D printing seems to be heading exactly that way.

  4. Information of Everything

The device mesh produces data at every touchpoint, and that data proliferates across all the devices that form the mesh. The goal is to make ‘sense’ out of this information goldmine, and that is what Information of Everything tries to address. It seeks to link data from different sources and produce meaningful information from them.

  5. Advanced Machine Learning

This is an area of great interest. It envisages an environment where machines automatically learn the environments they are in. DNNs (deep neural networks) enable hardware- and software-based machines to learn the environments they operate in. A precursor to this technology is already in use, in the form of self-healing networks.

  6. Autonomous agents and things

Advanced machine learning gives rise to autonomous agents that can function on their own. Typical examples are Google Now, Cortana and Siri, which use these frameworks to gauge the information received through the digital mesh and use it to produce results. This is a big deal, as it directly impacts customer behaviour, personalization and more.

  7. Adaptive security architecture

The complexity of running a digital ecosystem exposes the threats and vulnerabilities affecting it. Simply relying on perimeter access and rule-based security will not help in the future. The focus will shift to making applications safe at the application layer itself. Enterprises also need to analyse user and entity behaviour to identify acceptable patterns and weed out unacceptable or threatening ones.

  8. Advanced system architecture

Security is on everybody’s mind. With the accelerated rate of device adoption, security is a critical layer that can’t be ignored. Using field-programmable gate arrays, it is possible to build security systems that mimic the brain. Their lightweight architecture lets them be integrated into smaller form factors, with lower power consumption and greater efficiency.

  9. Mesh app and service architecture

With technology disruption taking place on a very large scale, large legacy monolith systems are giving way to smaller, componentized systems. These systems are easier to manage, troubleshoot and maintain. Microservices play a key role in developing agile systems that are deployed on cloud or mobile platforms. Container technology also helps in the faster rollout of software in microservices environments.

  10. IoT platforms

IoT platforms complement the digital mesh and its underlying device makeup. IoT platforms are what IT folks need to make IoT a reality. This basically boils down to managing, securing and integrating the technologies and standards that power devices and data.

At Qruize, we pride ourselves on being at the helm of innovation. We’ve handled projects that touch upon each of these emerging trends in some way or the other. If you have an idea, we can build it for you – just hit that button already!

5 golden rules for easy IT…



“Press F1 for help”

This is a very well-known statement. But we are in a time where that help needs real help. In today’s hybrid environments, there are enterprises still running legacy equipment, third-party vendors still producing adaptors for integration, virtualization, and the Cloud. Through it all, the IT team still has to manage this load. It’s important to understand where the holes are and plug them before something falls through. So here are some simple ideas that can be implemented quickly…

  1. Back up regularly and monitor the backups

The function of IT in an enterprise has largely been optimized with the Cloud framework. But are you operating on the right strategies for backup? It’s important not to just back up data, but to also think about metadata. That’s data about the data.

Metadata can help you out of serious issues, provided it is recorded or set in place somewhere. Ensure this simple fix is in your policies the next time you back up. Backups don’t touch active files, so if colleagues have left for the day, ensure a policy closes all active files so that everything is backed up promptly!
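As a small illustration of capturing metadata alongside a backup, the sketch below (class and method names are illustrative) reads a file’s basic attributes with the standard `java.nio.file` API, so a restore can later verify sizes and timestamps:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.attribute.BasicFileAttributes;

// Record basic metadata for a backed-up file: name, size and last-modified
// time. A real backup policy would persist these entries with the archive.
public class BackupMetadata {

    public static String describe(Path file) throws IOException {
        BasicFileAttributes attrs = Files.readAttributes(file, BasicFileAttributes.class);
        return String.format("%s size=%d modified=%s",
                file.getFileName(), attrs.size(), attrs.lastModifiedTime());
    }

    public static void main(String[] args) throws IOException {
        // Demo on a temporary file so the example is self-contained
        Path f = Files.createTempFile("backup-demo", ".txt");
        Files.write(f, "hello".getBytes());
        System.out.println(describe(f));
    }
}
```

Storing even this much "data about the data" next to each backup makes it cheap to detect truncated or stale copies before they are needed.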

  2. Maintain a cohesive team

Whatever the process or goal, it’s people who drive it. It becomes paramount that people are comfortable in what they do and how they do it. If IT teams function as post boxes, their overall value diminishes. This concept now stretches even further with methodologies like DevOps, where culture barriers need to be broken.

  3. Putting automation to work

IT management, which has become a lot simpler through outsourcing, still has many points that are monitored daily. Questions also arise around what is being monitored, whether it is useful, whether it is needed, and so on. Automation can simplify and restructure this. Identifying ways and methods to automate across every touchpoint will reduce failures and increase productivity, and this has to happen across everything IT stands for.

  4. Security concerns

Security is quintessential all the time. There’s no doubt that enterprises are wary and find it hard to move quickly to something virtual when they were used to physical boxes all along. Again, careful planning is the key to preventing an attack or avoiding becoming a victim. Enterprises should not leave this perimeter unchecked. Investing in the right security partner, one who safeguards your business interests, will be the best bet.

  5. Teams need to break culture barriers

The latest thing to join the software bandwagon is DevOps. As the software industry progresses, newer models of software delivery crop up. DevOps is no stranger, but the process and roadmap engineers face is a long-drawn one – and one that can’t change fast. Most software companies experience this as they grapple with DevOps, CI/CD and the like. One efficient way forward is to break all barriers that hinder communication, interactions and meetings. Over time, the team will build up and come together for everything. Breaking down barriers gives everyone in the team an instant connection with each other, facilitating greater responsiveness and agility… the thing that is needed today.

Whether you are a startup organization or a development house, these steps will help you to craft better software experiences for your users.

Fodder for Friday


We only have a few weeks left until we welcome 2016, so with everyone already sitting back to enjoy the holidays, we thought we’d make it easier for the tech community to keep up with what’s important and happening by feeding them a little fodder every week.

These articles have been hand-picked, and each comes with a small gist revealing a glimpse of what the story is largely about. Well, yes, you can thank us generously – we always welcome chocolates, pies, cakes… well, you get the point!

So here goes:

Best Practices for Usability Testing (bitovi)

These practical, useful tips to weave usability testing within the design process should help agile teams build it and ship it, all while getting users to test it as well.

Stories of SaaS Success 

4 SaaS companies. 4 different people. 4 different approaches. Yet, 4 brilliant success stories.

Buffer is Perplexed with Social Media

This one’s ironic. After an honest admission of confusion as to why their social media sharing is declining, Buffer’s Kevan Lee got 390 comments and over 2,000 shares for his article.

Key Lessons from Volkswagen

The VW issue has clearly pointed out the false sense of security under which we have been operating for a while when it comes to the networks, hardware and software that we rely on for IoT. Read on…

Towards complex software systems


Recent years have seen an exponential increase in software. For everything we can put our thoughts on, there is software that helps or automates the task. But as the scope for development increases, so do the risks. Whatever happens, software complexity keeps increasing, as there are too many components in orchestration with each other. The most important question is how to lower the risk and the friction between components or modules.

It’s just not enough to create software to address a problem. Software truly grows when it outlives the purpose for which it was created. This effect has been felt in the case of the Web and cloud environments. The hyperlink, conceived by Tim Berners-Lee in 1989 as a way to share information, soon became the cornerstone of software creation all around the world. Suddenly software mushroomed everywhere and information could be exchanged effortlessly. The same concept powers the Web: individual nodes process information by themselves and are independent of others in the group. This is how software has scaled, and this is how it should scale in the future. Software always scales by federation and widespread adoption, and open source is an excellent movement that builds on widespread adoption and use. Software deployments have changed and become more complex in the last few years – thanks to collaboration fuelled by social media and the Internet. This opens a whole new chapter in developing new-age software for the future.

There’s a friction component and a risk component involved in software. Friction happens because, over time, software modules get complicated; this normally slows things down. The risk factor is ever-present because various components come together to form a massive system: unless you check, check and recheck, you will always have a surprise. Though there are methods that address these problems, collaboration just changed the game all over again – you have multiple check-ins, check-outs, builds and forks. This is where you have to be super critical about releases that lower both friction and risk.

Two interesting concepts help sort this problem out: one is microservices, the other is containers. Microservices are created by a small team that builds, deploys and manages the service end to end. Since they are essentially self-contained modules, they lower friction, but the sheer number of moving parts increases risk. Containers, on the other hand, reduce risk but can increase friction. Using containers, you build just once and the code can run anywhere, any number of times, because the environment is consistent. This helps with better infrastructure utilization as well.

Continuous delivery is the magic wand we need to control both risk and friction when developing software – like hitting two birds with one stone. We not only want to lower risk but decrease friction as well. Code changes so fast that end-to-end testing alone may not be the right way to contain delivery schedules. CD perfects our release mechanism every time we deliver, taking risk and friction out of the equation. We’ve all seen that large, complex systems give you a final moment of surprise when testing for releases, because you can never be sure everything will work as expected from the get-go!

So in short, if stable releases are the order of the day, continuous delivery is your go-to platform.