A revolutionary analytics-driven platform that presents buyers with the best possible buying opportunities, delivering the ultimate shopping experience
Mandatum was developed to improve the online shopping experience by helping buyers plan their purchase decisions in advance, using deep learning and analytics applied to their shopping lists. The application helps users plan and purchase products by suggesting the right stores to buy from, at the right time and at the right price.
VSH leveraged its in-house Crawling Platform for the crawling requirements and developed a system that updates and stores product data every 24 hours and verifies it against predefined attributes. To provide price predictions for individual products, VSH developed a machine-learning algorithm that analyses and predicts product prices for the next seven, fifteen and thirty days. This streamlined price predictions, price-drop alerts, recommendations on the right time to buy, and the coupons mechanism.
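As an illustration of the idea, the simplest form of a multi-horizon price prediction fits a trend to recent daily prices and extrapolates it to the 7-, 15- and 30-day horizons. The sketch below uses a plain least-squares linear trend; it is a minimal stand-in, not the actual model VSH built, and all names and data are hypothetical.

```python
def forecast_prices(history, horizons=(7, 15, 30)):
    """Fit a least-squares linear trend to a daily price series and
    extrapolate it `horizons` days past the last observation."""
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    # Day index of the last observation is n - 1; project h days ahead.
    return {h: round(intercept + slope * (n - 1 + h), 2) for h in horizons}

# Ten days of hypothetical prices drifting down by 0.5 per day.
history = [100.0, 99.5, 99.0, 98.5, 98.0, 97.5, 97.0, 96.5, 96.0, 95.5]
print(forecast_prices(history))  # → {7: 92.0, 15: 88.0, 30: 80.5}
```

A production system would of course use richer models per product category, but the input/output shape stays the same: a price history in, one predicted price per horizon out.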
Project engagement included the development of:
Mandatum's management team has vast experience in data science and analytics, and they wanted to leverage that expertise by developing an analytics-driven platform to transform the online shopping experience. The project scope also included developing prediction models to optimize the large purchase decisions of corporates and large enterprises, along with several other long-term objectives. Mandatum's management was evaluating technology vendors capable of executing this complex solution: a team with experience in machine learning, crawling, online marketplaces and infrastructure management. After several evaluation interviews and discussions of the development approach with the VSH technical team, VSH was selected as the project partner for Mandatum.
VSH Solutions used its development platforms to accelerate delivery and optimize development costs.
In the initial phase, the product range and data crawling were limited to the Amazon store; coverage was later extended to other stores and online platforms.
Data crawling was a crucial function of the application, requiring voluminous crawling at a very rapid pace. Initially, VSH used the Keepa service to crawl stale data and its own solution for regular updates of the same; this setup was later migrated to the in-house platforms, ensuring high accuracy and availability of the crawlers and delivering factual data for analytics.
More than 30 prediction models were conceptualized and developed to support price predictions, category-wise product correlations, price fluctuations in anticipation of new product launches, price trends across product categories, and the probability of price fluctuation in the coming days, weeks and months. MongoDB was used to store the crawled, variable data and to run the machine-learning algorithms that assist users in the buying process.
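One of the signals mentioned above, category-wise product correlation, can be sketched as a Pearson correlation between two aligned daily price series. This is a minimal illustration under assumed data, not VSH's model; the product names and prices are hypothetical.

```python
import math

def price_correlation(series_a, series_b):
    """Pearson correlation between two aligned daily price series.
    Values near +1 mean the prices tend to move together."""
    n = len(series_a)
    ma = sum(series_a) / n
    mb = sum(series_b) / n
    cov = sum((a - ma) * (b - mb) for a, b in zip(series_a, series_b))
    sa = math.sqrt(sum((a - ma) ** 2 for a in series_a))
    sb = math.sqrt(sum((b - mb) ** 2 for b in series_b))
    return cov / (sa * sb)

# Hypothetical: a phone and its case both discounted over the same week.
phone = [699.0, 689.0, 679.0, 669.0, 659.0]
case = [19.9, 19.5, 19.1, 18.7, 18.3]
print(round(price_correlation(phone, case), 3))  # → 1.0
```

Computing such correlations across product pairs within a category is one way a platform can anticipate that a price drop on one product signals likely drops on related ones.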
A backend system was developed to share product data and buyers' preferred prices with vendors, who were given the opportunity to fulfil whichever requirements were feasible for them. Further, an accessible bidding mechanism was implemented to resolve the claims of multiple interested vendors; this practical system increased conversion rates and created a mutually beneficial arrangement between buyers and sellers.
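The core of such a bidding mechanism can be sketched as: collect vendor bids, keep only those at or below the buyer's preferred price, and award the lowest. The sketch below is a simplified illustration with hypothetical names; the real system would add timing, vendor ratings and other tie-breakers.

```python
def select_winning_bid(preferred_price, bids):
    """Among vendor bids at or below the buyer's preferred price,
    pick the cheapest; ties go to the earliest-submitted bid."""
    eligible = [b for b in bids if b["price"] <= preferred_price]
    if not eligible:
        return None  # no vendor could meet the buyer's price
    # min() is stable: the first of equally-cheap bids wins.
    return min(eligible, key=lambda b: b["price"])

bids = [
    {"vendor": "A", "price": 52.0},
    {"vendor": "B", "price": 47.5},
    {"vendor": "C", "price": 49.0},
]
print(select_winning_bid(50.0, bids))  # → {'vendor': 'B', 'price': 47.5}
```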
VSH took responsibility for improving product acceptance and providing the required visibility through tailored marketing activities. To minimize application support time and to expedite the training and onboarding of resources, VSH developed a comprehensive training program backed by standardized documentation and explanatory product videos.
2 Web Developers
Crawling the data of millions of products, then assessing, storing and regularly comparing it to generate buying suggestions, was a huge data-management exercise. Accurate crawling results were critical, as the application's core functionality depended entirely on them. VSH modified the Crawler Platform to meet these demanding requirements and established a practice of checking crawler health at regular intervals.
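A basic form of the crawler health check described above is a freshness audit: flag any product whose data was not refreshed within the 24-hour window. This is a minimal sketch with hypothetical field names, not the monitoring VSH actually built.

```python
from datetime import datetime, timedelta

def stale_records(records, now, max_age=timedelta(hours=24)):
    """Return the IDs of products whose last crawl is older than max_age,
    so a failing crawler surfaces as a growing stale list."""
    return [r["product_id"] for r in records if now - r["crawled_at"] > max_age]

now = datetime(2024, 1, 2, 12, 0)
records = [
    {"product_id": "p1", "crawled_at": datetime(2024, 1, 2, 3, 0)},  # 9h old
    {"product_id": "p2", "crawled_at": datetime(2024, 1, 1, 1, 0)},  # 35h old
]
print(stale_records(records, now))  # → ['p2']
```

Running a check like this on a schedule and alerting when the stale count rises is one simple way to verify crawler health at regular intervals.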
Identifying product price patterns, finding correlations between the various parameters, and implementing multiple machine-learning models was complicated. The issue was resolved by developing a set of models that deliver the best possible input to users' buying decisions.
During the final stages of development, a major change in the client's organization structure resulted in alterations to the project stakeholder positions. The VSH Account Owner assisted the new CEO and stakeholders by providing a technology briefing and facilitated the onboarding of the new team.
Methodologies and best practices for handling data at a sizable scale
The approach and innovative thinking needed to discover data-analytics patterns and the techniques to implement them
Find out more about how we can help you add value to your business