Redesign Bulk Order Creation in Cross-Border Shipping

Type

Case Study

Company

Ninja Van

In this case study, I show how my team and I redesigned bulk order creation for our B2B users, which improved the adoption rate and reduced the support burden on internal staff.

Context

The Cross-Border project aims to support B2B clients who need to ship parcels across Southeast Asia. While we do not provide end-to-end logistics services ourselves, we collaborate with vendors such as customs brokers, shipping companies, airlines, and trucking providers. In this role, we position ourselves as a 4PL (Fourth-Party Logistics) partner, coordinating and optimizing the cross-border delivery through our network of service providers.

One of the key features we offer to our B2B clients is the ability to create orders directly in our system, enabling them to send parcel information for shipment processing.

To handle high-volume operations (often exceeding a thousand orders at a time), we provided a bulk order creation feature via file upload. Clients could prepare their orders in an Excel file and simply drag and drop it into our system.

Since each order contains numerous input fields, every data point must be accurate for the system to validate and process the shipment correctly.

However, nine months after launching this feature, we encountered significant resistance from our customers, which became the starting point for this improvement project.

My role
  • Conducted a UX audit.

  • Defined & led the design principles & direction.

  • Designed flows & interaction patterns.


The problems

5% adoption

After 9 months of releasing this version, adoption remained extremely low: only 5% of bulk uploads were completed by end users. Most users depended on internal staff to upload orders on their behalf.

Users reported feeling confused and uncertain when using the feature.

The common feedback was that after uploading orders, users often encountered errors but were unsure what needed to be corrected. This led to frustration and made them hesitant or even afraid to try again on their own.

Evaluating impact: Why this problem mattered

Not scalable: If this pattern continued, the product would fail to deliver on its self-service value.

User support burden: Internal staff had to manually review, fix, and upload orders for the customer, consuming significant time and resources.

Frame the problem

How might we design an experience that guides users through resolving upload errors on their own, increasing confidence and reducing support dependency?

Design Point of View

We believe that when users abandon a feature that is essential to their workflow, it's often because they feel uncertain about how to use it.

By applying UX practices such as discoverability, error prevention & recovery, and self-explanatory design, we can help users feel more confident doing the task on their own.

Users

B2B clients who need to create and manage large volumes of cross-border shipping orders through our system.

User Needs

  • Input order details accurately to match the actual shipment.

  • Rely on the system to validate and process orders correctly.

User Pains

  • High learning effort: new users need extra time to understand how the system works.

  • Vague and unclear system feedback, making it difficult to identify and resolve errors.

  • Bulk uploads are prone to mistakes, but fixing errors is slow and repetitive.

User Motivations

  • Accuracy first: mistakes directly impact customs clearance and shipment accuracy, which can lead to revenue loss.

  • Get things done faster: complete tasks quickly, as they are repetitive, tedious day-to-day work.

  • Error avoidance: minimize time spent correcting mistakes after submission.


Design improvements

Improve interaction pattern: Aligning with user tasks

Before (Current): The system offered options that didn't match real user tasks.

  1. Manual input option, even though users typically needed to create over 1,000 orders at once.

  2. After uploading files, users were asked whether they wanted to re-map columns, while they actually expected to start with the correct template.

After (Improved design): Aligned with user tasks.

  1. Removed unnecessary, irrelevant options.

  2. Focused only on the user's task (upload & check for accuracy).

Before (Current): Users expected to see errors before submitting their files.

  1. Instead, the system lacked upfront validation → valid and invalid records were created together.

  2. As a result, users had to go into batch details afterward to view and fix invalid records, breaking the expectation of finishing in one flow.

After (Improved design): Users could review upload results before submission.

  1. Introduced a step-by-step process within the upload flow.

  2. Users could now review upload results before submission.

  3. Errors were surfaced early, allowing users to fix issues upfront instead of after submission.

Invalid data was stored as "temporary" records:

  • Only valid records are created.


Before: Manual input and column mapping options were rarely useful for users


Before: Errors were only found after submission, forcing users to fix them later.
After: Validations are wrapped within the upload flow, so users can review and fix errors upfront.
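To make the upfront-validation idea concrete, here is a minimal sketch of how rows could be checked before any order is created. The field names and rules below are hypothetical examples, not the actual Cross-Border schema.

```python
# Minimal sketch of upfront batch validation: every row is checked before any
# order is created, so users see all errors while still inside the upload flow.
# Field names and rules are hypothetical examples, not the real Cross-Border schema.
from dataclasses import dataclass, field

@dataclass
class RowResult:
    row_number: int                      # row in the uploaded Excel file
    errors: list = field(default_factory=list)

    @property
    def is_valid(self):
        return not self.errors

def validate_row(row_number, row):
    result = RowResult(row_number)
    if not row.get("tracking_id"):
        result.errors.append("Tracking ID is required.")
    if not isinstance(row.get("weight_kg"), (int, float)) or row["weight_kg"] <= 0:
        result.errors.append("Weight must be a positive number (kg).")
    if row.get("destination_country") not in {"SG", "MY", "TH", "VN", "ID", "PH"}:
        result.errors.append("Destination must be a supported country code.")
    return result

def validate_batch(rows):
    """Validate all rows upfront; nothing is created until the user has reviewed errors."""
    results = [validate_row(i, row) for i, row in enumerate(rows, start=2)]  # row 1 = header
    valid = [r for r in results if r.is_valid]
    invalid = [r for r in results if not r.is_valid]
    return valid, invalid

# Example: the user uploads two rows; the invalid one is surfaced immediately.
rows = [
    {"tracking_id": "NV001", "weight_kg": 1.2, "destination_country": "SG"},
    {"tracking_id": "", "weight_kg": -5, "destination_country": "XX"},
]
valid, invalid = validate_batch(rows)
for r in invalid:
    print(f"Row {r.row_number}: " + "; ".join(r.errors))
```

Only once the invalid list is empty (or the user has replaced the file) would orders actually be created, mirroring the "only valid records are created" behaviour described above.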
Improved Error prevention & recovery

Before (Current): Error feedback was delayed → users could only view errors long after submission.

After (Improved design): Error feedback is now instant → users can view results right after upload, fix errors, and replace the file within a few clicks.

Before (Current): Users had to check each order one by one to identify errors → forcing them to spend extra time investigating.

After (Improved design): More holistic error review → for thousands of records, users can cross-check errors in an Excel-like interface, side by side with their working file.

Before (Current): Error messages were inconsistent.

  • No standard for how a message should appear.

After (Improved design): Clear, consistent error messages & actionable guidance.

  • Consistency: keep phrasing uniform across all error states.

  • Actionable: explain what went wrong and what the user can do next.

Before (Current):

  • Too many input fields → increased user effort and raised the risk of errors.

  • Irrelevant errors (e.g., account setup issues) surfaced as raw API/system codes → confused users and blocked their main task.

After (Improved design): Redesigned the technical solution & business flow to provide a better UX, such as:

  • Reducing the number of input fields.

  • Eliminating API errors from the user experience.

How error feedback is shown: Before vs After.
Before: Users could only view errors one order at a time.
After: Users could review all errors on a single page.
After: Error feedback was instant, with a more holistic error review
Before: Error messages had no standard structure.
After: Error messages followed a consistent format.
After: Users could easily switch between the error view and their Excel file, making cross-checking and fixing errors more efficient.
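As an illustration of the "consistent and actionable" principle, the sketch below maps internal validation or API codes to one uniform user-facing structure (what went wrong, plus what to do next). The codes and copy are hypothetical examples, not the production error catalogue.

```python
# Sketch of a single source of truth for error copy: every internal code maps to
# one consistent, actionable message, so raw API/system codes never reach users.
# The codes and wording here are hypothetical examples.
ERROR_COPY = {
    "WEIGHT_INVALID": {
        "what_happened": "The parcel weight is missing or not a number.",
        "what_to_do": "Enter the weight in kilograms, e.g. 1.5.",
    },
    "HS_CODE_UNKNOWN": {
        "what_happened": "The HS code is not recognised for customs clearance.",
        "what_to_do": "Check the code against your commercial invoice and re-upload the file.",
    },
}

FALLBACK = {
    "what_happened": "This row could not be validated.",
    "what_to_do": "Contact support with your batch ID.",
}

def to_user_message(internal_code: str) -> str:
    """Render one uniform message format regardless of where the error originated."""
    copy = ERROR_COPY.get(internal_code, FALLBACK)
    return f"{copy['what_happened']} {copy['what_to_do']}"

print(to_user_message("WEIGHT_INVALID"))
print(to_user_message("E_ACC_SETUP_401"))  # a raw system code is never shown as-is
```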
Make the system more self-explanatory

These changes also make the system more self-explanatory by improving:

Clarity — Simplified tasks with clear, linear steps.

Familiarity — Error review mirrors spreadsheet (Excel-like) workflows; matches user expectations to finish a task in one flow; inline guidance (tooltips, labels, inline error states).

Feedback — Step-by-step responses and expanded error visibility so users always know where they are and what to do next.

For a clearer view, I’ve prepared the full screen flow mapping, which can be viewed on this Miro board.


Testing: A/B test & result

Methodology 

We conducted an A/B test to evaluate two versions of the order creation process.

  • Control version (A): the existing flow, in which users frequently made errors.

  • Improved version (B): the redesigned flow, intended to streamline the task and reduce user mistakes.

Both versions ran at the same time, with data collected continuously.

Hypothesis: the new order creation version requires less effort to create a valid order.

Conversion metrics:

  • Number of attempts needed to fix an order

  • Time spent to create an order

  • Conversion rate

Split %

  • Existing users: 50% on variant A, 50% on variant B.

  • New users: split evenly between variants A and B.

Sample size: 75 users 

Statistical method: Chi-square test to compare proportions of success 

Unit of analysis: One user account can have multiple people logging in to do the work, so we used unique sessions as the unit of analysis when calculating the conversion rate.
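To show how the chi-square comparison works in practice, here is a minimal sketch using scipy. The session counts below are hypothetical placeholders, not the actual experiment data.

```python
# Sketch of the chi-square test comparing success proportions between variants.
# The session counts are hypothetical placeholders, not the real experiment data.
from scipy.stats import chi2_contingency

# Rows: variant A, variant B; columns: sessions with a successful (error-free)
# upload vs. sessions that hit errors. Unit of analysis = unique session.
observed = [
    [73, 27],   # variant A (hypothetical)
    [96, 4],    # variant B (hypothetical)
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("The difference in success rates between variants is statistically significant.")
```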


Result

Variant B significantly reduced errors: only 3.9% of orders contained errors, compared to 27% in Variant A.

Variant B had only 3.9% of orders with errors, compared to 27% in Variant A


Variant B showed a significant improvement in step-by-step conversion rates.


Conversion rate


Most importantly, variant B allowed users to complete the order creation process in less than 10 minutes, whereas some shippers using variant A took more than 30 minutes to complete the task.


Time spent to create order


Impact of the improvement

Following the positive A/B test results, we rolled out the feature to all users. Within three months, adoption rose to around 90%.

The redesigned bulk order creation flow led to measurable improvements in both user experience and operational efficiency:

  • Higher adoption: More users successfully completed orders without requiring internal support.

  • Stronger conversion rates: Shippers completed tasks faster, increasing overall task completion.

  • Reduced support burden: Sales and support teams handled fewer manual interventions, saving time and resources.


Reflection

This project was one of the most challenging I have worked on. The main challenge wasn't just the design itself, but collaborating effectively with the tech team.

Improving the technical solution meant more than leaner code. In our case, it was what allowed us to deliver a more human experience for our users.

On paper, the design might look simple. But achieving the kind of simplicity that users were going to need required months of work behind the scenes.

One of the hardest problems was handling large volumes of data in a single batch. This required me, as a designer, to go beyond surface-level UX practices and gain a deeper understanding of the underlying technology.

At the end of the day, UX in technology is about helping people interact with machines efficiently. I spent weeks digging into the technical details so that I could facilitate conversations with the tech team about how we should rethink our data-handling approach. That work was essential because it enabled me to deliver interactions that felt as simple and intuitive as users expected.

Of course, there was resistance. Even within our tech team, it was not easy to suggest changes, especially after they had already spent three months implementing the feature in the first place. The team agreed to make a change only when the customer support burden became severe as our customer base grew.

The project reinforced that clear communication through UX design is critical, especially in complex workflows where users heavily rely on the interface for feedback and guidance.

When we truly prioritize clear communication in the design itself, we reduce confusion, improve recovery, and enable users to build confidence and mastery with the system.