During the summer of 2019 I built this MVP for an Australian seed-stage startup: an end-to-end e-commerce integration platform connecting individual stores (Shopify, WooCommerce, Magento) to Southeast Asian marketplaces (jd.com, zilingo.com). Many well-established e-commerce brands in North America, Europe, and Australia would like to reach audiences in Southeast Asian markets and sell their products there. That essentially requires an end-to-end integration between stores and marketplaces, which is too complex for most brands to handle in-house.
In this post I go through the high-level design of the platform and some of the challenges I ran into during implementation.
How I started working on the problem
The product and integrations work in 4 different layers. The brands’ ecommerce CMS, marketplaces, shipping company, and the management dashboard. Here is an overall view of how these pieces come together:
All the backend functions and utils belong to these four categories and modules:
Orders and shipping management
These functions are spread out across 5 main integrations with brands, marketplaces, and the shipping company. Let’s go through the stack of the software and the integrations in detail.
The entire backend is written in Python 3 with Django. The servers run Ubuntu on GCP, with Gunicorn as the WSGI server and NGINX as a reverse proxy in front of the Gunicorn processes. I chose Django with SQLite. Concurrency was not an issue, and SQLite handled lots of migrations along the way pretty well, but looking back, PostgreSQL might have been a better choice for staying compatible with future migrations. Celery Beat (with Redis) is used for scheduling tasks, and Postman for testing endpoints.
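To make the scheduling concrete, here is a sketch of what a Celery Beat schedule for this kind of platform could look like. The task paths and intervals below are illustrative assumptions, not the actual production values:

```python
# A sketch of the Celery Beat schedule. The task module paths
# ("orders.tasks.fetch_jd_orders", etc.) and the intervals are
# assumptions for illustration.
CELERY_BEAT_SCHEDULE = {
    "fetch-jd-orders": {
        # Pull new orders from the marketplace on a regular basis.
        "task": "orders.tasks.fetch_jd_orders",
        "schedule": 15 * 60,  # every 15 minutes, in seconds
    },
    "sync-inventories": {
        # Push approved inventory levels back to the marketplaces.
        "task": "inventory.tasks.sync_inventories",
        "schedule": 60 * 60,  # hourly
    },
}
```

Celery accepts a plain number of seconds as a schedule, which keeps the configuration readable for simple fixed intervals.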
The frontend app is built with React + Webpack. Nothing is rendered on the server side: the webapp is a single-page application that relies fully on the REST API. When deploying to production, a bash script builds the JS bundles, which are then collected into Django's static root and served from there.
Woocommerce and Magento on the brands side
On the brand CMS layer, the system is integrated with WooCommerce and Magento. Both are very well documented and fairly easy to work with. The integrations let us manage products, orders, prices, and inventories.
Products are imported automatically through our integration, or via CSV import. Some cleaning, translation, and field mapping happens in the backend before the products land in our product table. Here is a view of the main product management tools on the webapp:
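The cleaning-and-mapping step for the CSV path can be sketched roughly like this. The column names in FIELD_MAP and the price-normalization rule are assumptions for illustration, not the actual schema:

```python
import csv
import io

# Hypothetical mapping from a brand's CSV columns to our product table
# fields -- the column names here are assumptions for illustration.
FIELD_MAP = {"Product Name": "title", "SKU": "sku", "Price (AUD)": "base_price"}

def import_products(csv_text):
    """Parse a brand's CSV export and map it onto our product schema."""
    rows = csv.DictReader(io.StringIO(csv_text))
    products = []
    for row in rows:
        # Mapping step: rename columns and strip stray whitespace.
        product = {ours: row[theirs].strip() for theirs, ours in FIELD_MAP.items()}
        # Cleaning step: normalize the price to a number, skip broken rows.
        try:
            product["base_price"] = float(product["base_price"])
        except ValueError:
            continue
        products.append(product)
    return products
```

In the real pipeline there would also be a translation pass before the rows are written to the product table; that part is omitted here.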
JD.id has a terribly unstable REST API with unclear documentation and response formats. I had to make it work, and I did, but it took a bit of reverse engineering and getting past lots of weird error codes and unstable responses.
Take a look at their docs under the “seller to jd” section: https://api.jd.id/docCenter/docList
Zilingo is a fairly smaller marketplace, with most of its visitors coming from Thailand, India, the Philippines, Singapore …
Unlike JD.id, they have nice and useful docs for their endpoints. But it is still not an easy workflow to submit and activate products on the platform.
Here is a link to their docs: https://api.sellers.zilingo.com/assets/docs/ZilingoSellerApiDoc-v1.4.6.pdf
Here I’ll go through some of the more tricky parts of an ecommerce integration project like this. There are so many details that add to the complexity.
1. Pipelines and tagging everything with a state
Let’s start with the problem of having multiple sources of truth, with a pipeline of orders and products moving back and forth between sellers, admins, and marketplaces.

Think about an order. Right after it is created by the end customer on a marketplace, it moves through a pipeline with a changing state: it starts as NEW, and changes from there as it progresses. The same goes for a product: SYNCED and so on.
Now consider that all 5 main integrations (marketplaces, the shipping company, and the sellers’ CMSs), as well as our management core itself, have different types of states, and different terminologies for them to begin with. Everything has to be translated through a tagging system and brought together in our DB, which is the entry point of the pipeline.
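The translation idea can be sketched as a per-integration lookup table that normalizes external status strings into internal tags. The external status strings below are illustrative assumptions, not the marketplaces' actual values:

```python
# A sketch of status translation: each integration reports states in its
# own vocabulary, and everything is normalized to internal tags at the DB
# entry point. The external strings below are assumptions for illustration.
STATUS_MAP = {
    "jd": {"WAIT_SELLER_STOCK_OUT": "NEW", "TRADE_CANCELED": "CANCELLED"},
    "zilingo": {"Pending": "NEW", "Cancelled": "CANCELLED"},
}

def normalize_status(source, external_status):
    """Translate a marketplace-specific status into our internal tag."""
    try:
        return STATUS_MAP[source][external_status]
    except KeyError:
        # Unknown statuses are flagged for manual review rather than guessed.
        return "UNKNOWN"
```

Keeping the mapping in one place means a marketplace renaming a status only breaks one table, not logic scattered across the codebase.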
I started by thinking through all the possible state-change scenarios and their fallbacks; for example, “what should happen if an order was CANCELLED in a marketplace, right after we have sent it to the seller?”
Working out the state management on a piece of paper was the most fundamental part of the architecture. If you mess this part up, it will easily break everything else you build on top of it.
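That paper exercise can be captured in code as an explicit transition table, so illegal state changes fail loudly instead of silently corrupting the pipeline. The transitions below illustrate the idea; the real set was larger, and the state names beyond NEW and CANCELLED are assumptions:

```python
# A sketch of an order state machine. The names beyond NEW / CANCELLED are
# assumptions; the real table covered many more states and fallbacks.
ORDER_TRANSITIONS = {
    "NEW": {"APPROVED", "CANCELLED"},
    "APPROVED": {"SENT_TO_SELLER", "CANCELLED"},
    # Fallback: a marketplace cancellation arriving after the order was
    # already sent to the seller goes to a state an admin resolves by hand.
    "SENT_TO_SELLER": {"SHIPPED", "CANCELLED_AFTER_SEND"},
}

def transition(order_state, new_state):
    """Apply a state change only if the transition table allows it."""
    if new_state not in ORDER_TRANSITIONS.get(order_state, set()):
        raise ValueError(f"illegal transition {order_state} -> {new_state}")
    return new_state
```

The benefit of making the table explicit is exactly the cancellation scenario above: the awkward cases get a named state instead of an undefined one.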
2. Aggregating orders
A general scenario for an order happens like this:
An order is created by an end customer on, say, JD.id. We fetch orders from the marketplace on a regular basis and store them in the orders table. From there, each order has to be edited, audited, and approved. The order is then transformed and finally pushed to the seller’s CMS (Magento, WooCommerce).
But that’s not all. We can’t just pass single orders along one by one as they arrive. There are hard-coded logistical restrictions, and we aggregate the orders in order to minimize the freight cost charged by the shipping company.
The logic for order notification emails is simpler. I query orders with specific tags and bundle them into one email, then build printable shipping labels for the orders in the bundle using the order details. The emails, which include the orders’ details and the printable shipping labels, are sent out to the brands twice a day.
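The aggregation step can be sketched like this, with orders represented as plain dicts. The field names ("seller", "tag") and the MIN_BUNDLE threshold are illustrative assumptions, standing in for the hard-coded logistical restrictions:

```python
from collections import defaultdict

# A sketch of order aggregation, assuming orders are plain dicts with
# "seller" and "tag" keys. The field names and the MIN_BUNDLE threshold
# are assumptions for illustration.
MIN_BUNDLE = 3  # don't release a shipment until a seller has this many orders

def bundle_orders(orders, tag="APPROVED"):
    """Group tagged orders per seller; only full bundles are released."""
    by_seller = defaultdict(list)
    for order in orders:
        if order["tag"] == tag:
            by_seller[order["seller"]].append(order)
    # Each released bundle becomes one notification email with its
    # shipping labels attached.
    return {s: o for s, o in by_seller.items() if len(o) >= MIN_BUNDLE}
```

Sellers whose bundles are below the threshold simply stay in the queue until the next twice-daily run.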
3. Keeping prices and inventories updated
There are multiple sources of truth for prices and inventories. For prices, for example, we have the prices in a specific currency on the brand’s website, prices in our management core, and finally different prices which are pushed to the marketplaces. The final prices and inventory levels are edited frequently in response to the sales of each product. I had to come up with a way to keep the prices and inventories on the marketplaces in sync with our core. I decided Django signals were the right tool for the job, and they are also pretty simple to implement.
For example here is how the inventories are updated:
The inventory levels are periodically retrieved (using /index.php/rest/V1/stockItems/ for Magento and stock_quantity for WooCommerce) and saved to the inventories table, which then signals our tables for the marketplaces. From there the levels can be edited by our admins and finally get synced with the live inventories on the marketplaces. The same goes for the prices:
```python
@receiver(post_save, sender=Price)
def update_prices(sender, instance, created, **kwargs):
    ...
    if not created:
        price = instance
        JDProduct.objects.filter(product_id=price.product_id).update(
            jd_price=price.jd_price,
            cost_price=price.jd_cost_price,
        )
        ZilingoProduct.objects.filter(product_id=price.product_id).update(
            zilingo_price=price.zilingo_price,
        )
    ...
```
4. Handling the edits from webapp tables
There is a lot of manual editing done by the admins. On the webapp, all of our tables are editable React tables. Whenever changes are made to a table, the data is updated in the React state. There might be a few rounds of back-and-forth editing; when the admin is finally happy with the changes, she pushes a button to save them.
Now on the backend I need to decide what to do with the updated data. First I have to recognize which rows have changed, which I do by comparing the updated data with the current data in the DB.

After finding the changes in that specific table, I still need to decide how to change the state of the changed products or orders.
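The change-detection step can be sketched as a comparison of the posted rows against the stored rows, keyed by primary key. Rows are plain dicts here for illustration:

```python
# A sketch of change detection: compare the rows posted from the React
# table against the current DB rows, keyed by primary key. Rows are
# represented as plain dicts for illustration.
def changed_rows(posted, current, key="id"):
    """Return the posted rows whose fields differ from the stored rows."""
    stored = {row[key]: row for row in current}
    changed = []
    for row in posted:
        old = stored.get(row[key])
        # A row counts as changed if it is new or any field differs.
        if old is None or any(old.get(f) != v for f, v in row.items()):
            changed.append(row)
    return changed
```

Only the rows returned here go on to the state-update logic, so an admin saving an untouched table is a no-op.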
Let’s say it’s a product. If it is in a NEW state, it has to be updated to EDITED (waiting for the admin to approve the changes). If it is already live on one of the marketplaces, its state should change to UPDATE (again waiting for the admin to approve the update). And if it was already in the EDIT state, the last_edit_datetime gets updated.
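Those branches can be summed up in a small decision function. The state names come from the description above; treating SYNCED as the “live on a marketplace” state is an assumption:

```python
from datetime import datetime, timezone

# A sketch of the state decision for an edited product. The state names
# come from the post; treating SYNCED as "live on a marketplace" is an
# assumption.
def apply_edit(product):
    """Update a product's state (and edit timestamp) after a table edit."""
    if product["state"] == "NEW":
        product["state"] = "EDITED"      # waiting for admin approval
    elif product["state"] == "SYNCED":   # already live on a marketplace
        product["state"] = "UPDATE"      # waiting for approval of the update
    elif product["state"] == "EDIT":
        pass                             # state unchanged; only the timestamp moves
    product["last_edit_datetime"] = datetime.now(timezone.utc)
    return product
```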
I hope you found this report useful, especially if you’re in the middle of building something similar. I obviously cannot go into more technical detail, but if you have any general questions about the overall architecture or the integrations, feel free to send me a message.