
Taha Ashtiani

Searching for EV in the tails.


I started working on this project during the summer of 2019, helping an Australian startup with their MVP. The idea is relatively simple, but it gets tricky when it comes to handling the details. There are many well-established e-commerce brands in North America, Europe, and Australia who would love to reach audiences in Southeast Asian markets and sell their products at a premium.

Now think about selling on the Amazon marketplace (with the FBA model, for example). Every single aspect of it has been figured out, courtesy of all the tools and logistics startups in that space. But when it comes to selling on a less common foreign marketplace, things are different. It is far too complicated a task for a brand to handle in-house. And that is the opportunity for a startup to come in and take care of all the integrations, logistics, and shipment as a business partner of those brands.

In this post I want to go through the overall workflow of the app and the challenges in designing and implementing it. I hope you find it useful.

How I started working on the problem

The product and its integrations work across four different layers: the brands’ e-commerce CMS, the marketplaces, the shipping company, and the management dashboard. Here is an overall view of how these pieces come together:


All the backend functions and utilities belong to these four modules:

  • Product management

  • Orders and shipping management

  • Inventory management

  • Pricing management

These functions are spread across five main integrations with the brands, the marketplaces, and the shipping company. Let’s go through the software stack and the integrations in detail.

The Stack


The entire backend is written in Python 3 with Django. I used Ubuntu servers on GCP, with Gunicorn as the WSGI server and NGINX as a reverse proxy in front of the Gunicorn processes. I chose to pair Django with SQLite. Concurrency was not an issue, and it handled lots of migrations along the way pretty well, but looking back I think PostgreSQL would have been a better choice for staying compatible with future migrations. Celery Beat (with Redis) is used for scheduling tasks, and Postman for testing endpoints.
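To give a feel for the Celery Beat part, here is a minimal schedule sketch. The task names, intervals, and the Redis URL are illustrative placeholders, not the project’s real configuration:

```python
# Illustrative Celery Beat configuration: periodic jobs for fetching
# marketplace orders and sending the twice-daily notification emails.
from celery import Celery
from celery.schedules import crontab

app = Celery("core", broker="redis://localhost:6379/0")

app.conf.beat_schedule = {
    "fetch-marketplace-orders": {
        "task": "orders.tasks.fetch_new_orders",  # hypothetical task path
        "schedule": 15 * 60,  # every 15 minutes, in seconds
    },
    "send-order-emails": {
        "task": "orders.tasks.send_notification_emails",  # hypothetical
        "schedule": crontab(hour="8,16", minute=0),  # twice a day
    },
}
```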


The frontend app is built with React + Webpack. Nothing is rendered on the server side: the webapp is a single-page application and relies fully on the REST API. When deploying to production, a bash script builds the JS bundles, which are then collected into Django’s static root and served from there.


WooCommerce and Magento on the brands’ side

On the brand CMS layer, the system is integrated with WooCommerce and Magento. Both are very well documented and fairly easy to work with. The integrations let us manage products, orders, prices, and inventories.

The products are imported automatically through our integration, or via CSV import. Some cleaning, translation, and field mapping happens in the backend before the products land in our product table. Here is a view of the main product management tools on the webapp:
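The cleaning and mapping step looks roughly like this. The column names in `FIELD_MAP` are hypothetical stand-ins for a brand’s real CSV export, and the real pipeline also handles translation:

```python
import csv
import io

# Hypothetical mapping from a brand's CSV export columns to our
# product table fields.
FIELD_MAP = {"Product Name": "title", "SKU": "sku", "Price (AUD)": "source_price"}

def clean_row(row):
    """Strip stray whitespace and rename fields before import."""
    return {FIELD_MAP[k]: v.strip() for k, v in row.items() if k in FIELD_MAP}

def import_products(csv_text):
    """Parse a CSV export and return rows ready for the product table."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [clean_row(r) for r in reader]
```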




jd.id

jd.id is the Indonesian subsidiary of jd.com (known as Jingdong), the huge Chinese retailer. Along with tmall.com, these are among the most gigantic online marketplaces on planet earth :)

Their REST API is terribly unstable, with unclear documentation and inconsistent response formats. I had to make it work, and I did, but it took a bit of reverse engineering and getting past lots of weird error codes and flaky responses.
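One generic way to survive an API this unstable is retrying with exponential backoff. This is a sketch, not their client code: the `call()` signature, status codes, and attempt counts are assumptions, and the real integration also had to map jd.id-specific error codes:

```python
import random
import time

def fetch_with_retry(call, max_attempts=5, base_delay=1.0,
                     retryable=(500, 502, 503)):
    """Retry a flaky API call. `call()` is assumed to return
    a (status_code, body) tuple."""
    for attempt in range(max_attempts):
        status, body = call()
        if status == 200:
            return body
        if status not in retryable:
            raise RuntimeError(f"non-retryable error {status}")
        # Exponential backoff with a little jitter so retries
        # don't hammer the unstable API in lockstep.
        time.sleep(base_delay * 2 ** attempt + random.random() * base_delay * 0.1)
    raise RuntimeError(f"gave up after {max_attempts} attempts")
```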

Take a look at their docs under the “seller to jd” section: https://api.jd.id/docCenter/docList


Zilingo

Zilingo is a fairly small marketplace, with most of its visitors from Thailand, India, the Philippines, Singapore, and nearby countries.

Unlike jd.id, they have nice and useful docs for their endpoints, but the workflow for submitting and activating products on the platform is still not easy.

Here is a link to their docs: https://api.sellers.zilingo.com/assets/docs/ZilingoSellerApiDoc-v1.4.6.pdf

Shipping company


Interesting challenges

Here I’ll go through some of the real challenges and tricky parts of an e-commerce integration project like this. There are many details that can get complicated to architect and implement, but with the right tools and a little engineering, everything is possible.

1. Pipelines and tagging everything with a state

Let’s start with the problem of having multiple sources of truth, with a pipeline of orders and products moving back and forth between sellers, admins, and marketplaces.

Think about an order. Right after it is created by the end customer on a marketplace, it moves through a pipeline with a changing state. It starts as NEW, which can change to EDITED, AWAITING_CONFIRMATION, APPROVED, SUCCESSFULLY_PROCESSED, CANCELLED, QUEUE, or SYNCED.


Now consider that all five main integrations (the marketplaces, the shipping company, and the sellers’ CMSs), as well as our management core itself, have different kinds of states, and different terminologies for them to begin with. It all has to be translated through a tagging system and brought together in our DB, which is the entry point of the pipeline.

I started by thinking through all the possible state-change scenarios, considering fallbacks and questions like “what should happen if an order is CANCELLED on a marketplace right after we have sent it to the seller?”

Working out the state management on a piece of paper was the most fundamental part of the architecture. If you mess this part up, it will easily break everything you build on top of it.
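Those paper sketches ultimately boil down to a transition table plus a guard that refuses illegal moves. A minimal version in Python, using the state names from above (the exact edges here are illustrative, not the production rules):

```python
# Illustrative order-state transition table: which states each state
# may legally move to.
ALLOWED = {
    "NEW": {"EDITED", "AWAITING_CONFIRMATION", "CANCELLED"},
    "EDITED": {"AWAITING_CONFIRMATION", "CANCELLED"},
    "AWAITING_CONFIRMATION": {"APPROVED", "CANCELLED"},
    "APPROVED": {"QUEUE", "CANCELLED"},
    "QUEUE": {"SYNCED", "CANCELLED"},
    "SYNCED": {"SUCCESSFULLY_PROCESSED", "CANCELLED"},
}

def transition(current_state, new_state):
    """Refuse any state change that is not in the table."""
    if new_state not in ALLOWED.get(current_state, set()):
        raise ValueError(f"illegal transition {current_state} -> {new_state}")
    return new_state
```

Centralizing the rules in one table makes the awkward scenarios (like a late CANCELLED from a marketplace) a lookup rather than scattered if-statements.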

2. Aggregating orders

A general scenario for an order happens like this:

An order is created by an end customer on, let’s say, jd.id. We fetch orders from them on a regular basis and store them in the orders table. From there each order has to be edited, audited, and approved. The order is then transformed and finally pushed to the seller’s CMS (Magento or WooCommerce).

But that’s not all. We can’t just pass single orders along one by one, every time we get one. There are logistical restrictions, hard-coded into the system, which aggregate the orders in order to minimize the freight cost charged by the shipping company.
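A simplified sketch of that aggregation step, using a per-bundle weight cap as a stand-in for the real hard-coded freight rules (the field names and the cap are assumptions):

```python
from collections import defaultdict

def aggregate_orders(orders, max_bundle_weight=30.0):
    """Group orders per seller, then split each group into bundles
    that stay under a weight cap. Each order is a dict with
    (hypothetical) 'seller' and 'weight' fields."""
    by_seller = defaultdict(list)
    for o in orders:
        by_seller[o["seller"]].append(o)

    bundles = []
    for seller, group in by_seller.items():
        current, weight = [], 0.0
        for o in group:
            # Close the current bundle if adding this order would
            # push it over the cap.
            if current and weight + o["weight"] > max_bundle_weight:
                bundles.append((seller, current))
                current, weight = [], 0.0
            current.append(o)
            weight += o["weight"]
        if current:
            bundles.append((seller, current))
    return bundles
```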

The logic for order notification emails is simpler. I query orders with specific tags and bundle them into one email, then build the printable shipping labels for the orders in the bundle using the order details. The emails, containing the orders’ details and printable shipping labels, are sent out to the brands twice a day.
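Once the tagged orders are queried, building the email body is plain string assembly. A sketch along those lines (the order fields and the line format are placeholders, and the real emails also attach the printable labels):

```python
def build_notification_email(orders):
    """Bundle a list of tagged orders into one plain-text email body.
    Each order is a dict; the fields used here are placeholders."""
    lines = [f"{len(orders)} order(s) ready to ship:"]
    for o in orders:
        lines.append(f"- {o['order_id']}: {o['customer']}, {o['address']}")
    return "\n".join(lines)
```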

3. Keeping prices and inventories updated

There are multiple sources of truth for prices and inventories. For prices, for example, we have the prices in a specific currency on the brand’s website, the prices in our management core, and finally the different prices pushed to the marketplaces. The final prices and inventory levels are edited frequently in response to the sales of each product. I had to come up with a way to keep the prices and inventories on the marketplaces in sync with our core. I decided Django signals were the right tool for the job, and they are also pretty simple to implement.

For example here is how the inventories are updated:

The inventory levels are periodically retrieved (using /index.php/rest/V1/stockItems/ for Magento and stock_quantity for WooCommerce) and saved to the inventories table. The changes are then signaled to our marketplace tables, where they can be edited by our admins and finally synced with the live inventories on the marketplaces. The same goes for prices.

from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=Price)
def update_prices(sender, instance, created, **kwargs):
    # React only to edits of existing prices, not the initial import.
    if not created:
        price = instance
        JDProduct.objects.filter(product_id=price.product_id).update(
            jd_price=price.jd_price, cost_price=price.jd_cost_price)
        ZilingoProduct.objects.filter(product_id=price.product_id).update(
            zilingo_price=price.zilingo_price)


4. Handling the edits from webapp tables

There is a lot of manual editing done by the admins. On the webapp, all of our tables are editable React tables. Right after any change is made to a table, the data is updated in the React state. There may be a few rounds of back-and-forth editing, and when the admin is finally happy with the changes, she pushes a button to save them.

Now on the backend I need to decide what to do with the updated data. First I have to recognize which rows have been changed. I do that by comparing the updated data with our current data in the DB.
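The comparison can be as simple as a dict diff keyed by primary key. A sketch under that assumption (the real tables have more fields and some type coercion to worry about):

```python
def changed_rows(incoming, current):
    """Return the primary keys of rows whose posted data differs from
    what's in the DB. Both arguments map pk -> row dict."""
    return [pk for pk, row in incoming.items() if current.get(pk) != row]
```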

After finding the changes in that specific table, I still need to decide how to change the state of those changed products or orders.

Let’s say it’s a product. If it is in the NEW state, it has to be updated to EDITED (waiting for the admin to approve the changes). If it is already live on one of the marketplaces, it should change from LIVE to UPDATE (again waiting for the admin to approve the update). And if it was already in the EDITED state, only its last_edit_datetime gets updated.
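Those rules reduce to a small decision function, using the state names from above (a sketch, not the production code):

```python
def next_product_state(state):
    """Decide which state a product moves to after an admin edit."""
    if state == "NEW":
        return "EDITED"   # edit still needs admin approval
    if state == "LIVE":
        return "UPDATE"   # live product: the update waits for approval
    if state == "EDITED":
        return "EDITED"   # already edited: only last_edit_datetime changes
    return state          # any other state is left untouched
```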

I hope you find this report useful, especially if you’re in the middle of building something similar. I obviously can’t get into more technical detail here, but if you have any general questions about the overall architecture or the integrations, feel free to send me a message.