
Building the Food Automation Engine: A Technical Deep-Dive

The complete implementation guide — from reverse-engineering the API to automating 1,20,000+ food orders for ~700 users at FF21.

#automation #reverse-engineering #api #node.js #express #prisma #backend #system-design

If you haven't read Part 1 yet — From Frustration to Automation: Automating Food Orders for ~700 Users — go read that first. It covers the backstory, the pain point, and the journey from a personal script to a public web app serving hundreds of residents at FF21.

This post is the technical companion. No fluff, no backstory — just code, architecture decisions, and the lessons I learned building a system that placed 1,20,000+ food orders autonomously over 8 months. Let's rip the engine apart.


Reverse-Engineering the API

The FF21 food ordering app doesn't have a public API. There's no documentation, no developer portal — just a mobile app built on a WebView wrapping a PHP backend.

To automate anything, I first had to figure out what the app was actually doing under the hood.

The Proxy Setup

I used a rooted Android emulator paired with Burp Suite as an intercepting proxy. Root access was required to bypass SSL pinning — without it, the app refuses to trust any custom certificate authority, so HTTPS traffic is invisible to the proxy.

If you want to learn this technique, there are video walkthroughs online that cover the rooted-emulator and Burp Suite setup in detail.

With the proxy capturing every request, I opened the FF21 app, logged in, browsed the food menu, and placed a test order — all while watching the raw HTTP traffic flow through Burp Suite.

What I Discovered

The app uses two separate backend systems:

  1. api.getmilo.app — The community/authentication layer (Milo platform). Handles OTP login, session management, and feed tiles. Uses a REST API with okhttp/4.2.1 as the User-Agent.

  2. www.ff21.co.in — The actual food ordering system. A PHP-based web app that renders HTML pages inside the mobile app's WebView. This is where orders are placed.

The key insight was that the mobile app is essentially a web browser in disguise. The "native" food ordering screen is just an HTML page loaded via a URL with an authTokken query parameter (yes, that's how they spelled it — double k).

Key Endpoints

Here's what the traffic analysis revealed:

| Endpoint | Method | Purpose |
| --- | --- | --- |
| api.getmilo.app/send_otp_to_phone_number/ | POST | Send OTP to user's phone |
| api.getmilo.app/verify_phone_number_otp/ | POST | Verify OTP, receive access token |
| api.getmilo.app/signup_with_token/ | POST | Exchange access token for session ID |
| api.getmilo.app/get_feed_tiles/ | GET | Fetch app dashboard data (contains auth token) |
| www.ff21.co.in/app_login.php | GET | Establish PHP session from auth token |
| www.ff21.co.in/user_app/get_qr_service.php | GET/POST | View menu / place order |
| www.ff21.co.in/user_app/book_your_meal_tab.php | GET | Fetch meal pack configuration |

Interesting Findings

The Milo API requires custom headers that mimic the actual Android app — without them, requests are rejected:

const headers = {
  "Client-Version": "423",
  "Client-Type": "android",
  "Client-Package": "com.ff21.community",
  "Community-Code": "FF21",
  "Community-Id": "707",
  "Language-Code": "en",
  "User-Agent": "okhttp/4.2.1",
};

Explanation:

  • Client-Version: 423 is the APK version number — the server validates this. If it's too old, the API rejects the request.
  • Client-Package identifies the specific white-label community app (FF21 is built on the Milo platform shared by multiple communities).
  • Community-Code and Community-Id tell the backend which community's data to return.
  • The User-Agent must be okhttp/4.2.1 — the exact HTTP library the Android app uses internally. Sending a browser User-Agent gets blocked.

The food ordering backend (ff21.co.in) is a different story entirely. It expects browser-like headers since the WebView renders HTML:

const headers = {
  "User-Agent": "Mozilla/5.0 (Linux; Android 16; ...) AppleWebKit/537.36 ... Mobile Safari/537.36",
  "X-Requested-With": "com.ff21.community",
  "Sec-Ch-Ua-Platform": '"Android"',
  "Sec-Fetch-Site": "same-origin",
  "Content-Type": "application/x-www-form-urlencoded",
};

Explanation:

  • The X-Requested-With header is critical — it identifies traffic as coming from within the Android app's WebView, not a standalone browser.
  • The Sec-Ch-Ua-Platform and Sec-Fetch-* headers replicate what a Chromium-based WebView actually sends.
  • The PHP backend serves HTML pages and accepts form-encoded POST data, not JSON — it's a traditional web form submission disguised as an API.

Authentication & Session Management

Authentication is the trickiest part of the whole system. The user goes through a five-step chain just to get from a phone number to a valid food-ordering session. Here's the full flow:

Phone Number → OTP → Access Token → Session ID → Auth Token → PHP Session (PHPSESSID)

Step 1: Send OTP

When a user registers on our web app, we forward their phone number to the Milo API. This is handled in controller/send_otp.js:

const sendOtp = async (req, res) => {
  const phone_number = req.body ? req.body.phone_number : null;
 
  if (!/^\d{10}$/.test(phone_number)) {
    return res.status(400).json({
      message: "Please provide a valid 10-digit phone number.",
    });
  }
 
  const url = "https://api.getmilo.app/send_otp_to_phone_number/";
 
  const body = new URLSearchParams({
    phone_number: phone_number,
    country_code: "+91",
  }).toString();
 
  const apiRes = await fetch(url, { method: "POST", headers, body });
  const data = await apiRes.json();
 
  if (data.status !== "PASS") {
    return res.status(403).json({
      message: "Phone number is not registered with FF21!",
    });
  }
 
  return res.status(200).json({ message: "Successfully sent OTP." });
};

Explanation:

  • We validate the phone number with a strict 10-digit regex before making any external call — fail fast.
  • The country code is hardcoded to +91 (India) since all FF21 residents are in India.
  • If data.status != "PASS", the phone number isn't registered on FF21. This acts as an implicit authorization gate — only existing FF21 residents can use our service.

Step 2: Verify OTP and Chain All the Tokens

This is where the magic happens. Once the user provides their OTP, we verify it and then cascade through every authentication layer in a single request. The entire chain lives in controller/verify_otp.js:

const verifyOtp = async (req, res) => {
  // ... validation ...
 
  const url = "https://api.getmilo.app/verify_phone_number_otp/";
  const body = new URLSearchParams({
    phone_number: phone_number,
    otp: otp,
  }).toString();
 
  const apiRes = await fetch(url, { method: "POST", headers, body });
  const data = await apiRes.json();
 
  if (data.status !== "PASS") {
    return res.status(401).json({ message: "Invalid OTP!" });
  }
 
  const userData = await getSession(data.access_token);
  const authToken = await getAuthToken(userData.sessionId);
  const phpSessId = await getPhpSessId(authToken);
  const guestId = await getGuestId(authToken, phpSessId);
  const packInfo = await getPackInfo(guestId, phpSessId);
  // ...
};

Explanation:

  • The OTP verification returns an access_token. But that alone isn't enough — it's only step one of a five-layer auth chain.
  • getSession() exchanges the access token for a session ID (a server-side session cookie from the Milo platform).
  • getAuthToken() uses that session to fetch the app's dashboard, which contains an embedded auth token — a base64-encoded string hidden inside a URL in the feed tiles JSON.
  • getPhpSessId() uses the auth token to establish a PHP session on the ff21.co.in food ordering server.
  • getGuestId() and getPackInfo() scrape the resulting HTML pages to extract the user's guest ID and meal pack configuration.

This entire waterfall runs once per user at login time, and every piece of extracted data gets persisted so we never re-run it.
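The getAuthToken() step deserves a closer look. Its internals aren't shown here, but conceptually it scans the feed-tiles JSON for a URL carrying the authTokken parameter. A minimal sketch; the { tiles: [{ url }] } response shape is my simplification for illustration, not the real Milo payload:

```javascript
// Sketch of the token extraction inside getAuthToken(). Only the idea
// ("find the food-ordering URL in the feed tiles, read authTokken off it")
// comes from the post; the tile shape here is an assumed simplification.
function extractAuthToken(feedTiles) {
  for (const tile of feedTiles.tiles || []) {
    if (typeof tile.url === "string" && tile.url.includes("authTokken=")) {
      // The token rides along as a query parameter, misspelling included
      return new URL(tile.url).searchParams.get("authTokken");
    }
  }
  return null; // no tile carried a food-ordering URL
}
```

For example, extractAuthToken({ tiles: [{ url: "https://www.ff21.co.in/app_login.php?authTokken=abc123" }] }) returns "abc123".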

Device Fingerprint Spoofing

The getSession() function in controller/get_session.js does something interesting — it generates a random device fingerprint for each login attempt:

const deviceList = [
  { manufacturer: "Samsung", model: "SM-G998B", device: "o1s", product: "o1sxxx" },
  { manufacturer: "Google", model: "Pixel 7", device: "panther", product: "panther" },
  { manufacturer: "OnePlus", model: "CPH2449", device: "OP594DL1", product: "CPH2449" },
  { manufacturer: "Xiaomi", model: "M2102J20SG", device: "alioth", product: "alioth_global" },
  // ... 6 more devices
];
 
function generateDeviceId() {
  return crypto.randomBytes(8).toString("hex");
}
 
function getRandomDevice() {
  return deviceList[Math.floor(Math.random() * deviceList.length)];
}

Explanation:

  • The Milo API expects device information as part of the signup/login flow — manufacturer, model number, device codename, and a unique device ID.
  • Since all requests originate from our server (not from actual Android phones), we maintain a pool of 10 real Android device profiles and pick one at random each time.
  • The deviceId is a cryptographically random 16-character hex string generated via crypto.randomBytes(8).
  • This prevents the backend from flagging all requests as coming from the same "device" — which could trigger rate limiting or account lockouts.

The device info gets packaged into a debug string that mimics the actual format the app sends:

function generateDeviceInfo(device, apkVersion = 423) {
  const osVersions = [
    "6.6.66-android15-8-gb66429556fb8-ab13070261(13344233)",
    "5.15.137-android14-8-g3c5fe2c4c5ab-ab10284149(12345678)",
    // ... more OS versions
  ];

  // Pick one at random; the real code randomizes the API level the same way
  const randomOsVersion = osVersions[Math.floor(Math.random() * osVersions.length)];

  return `Debug-infos:\n Communities APK Version: ${apkVersion}\n OS Version: ${randomOsVersion}\n OS API Level: ${randomApiLevel}\n Manufacturer: ${device.manufacturer}\n Device: ${device.device}\n Model (and Product): ${device.model} (${device.product})\n MAC: `;
}

Explanation:

  • This string is sent as the device field in the form body. It's a multi-line debug dump that the real app generates from android.os.Build system properties.
  • The randomized OS version and API level add another layer of fingerprint diversity.
  • The MAC field is always empty — the real app also sends it blank (probably a privacy measure).

Persisting Auth Data

After the entire chain completes, all extracted credentials and pack info are stored in PostgreSQL via Prisma:

await prisma.user.upsert({
  where: { phone: phone_number },
  update: {
    ...dataToStore,
    updatedAt: new Date(),
  },
  create: {
    phone: phone_number,
    ...dataToStore,
  },
});

Explanation:

  • upsert ensures that if a user re-authenticates (e.g., their session expired), their existing record is updated rather than duplicated.
  • The stored data includes: authToken, sessionId, guestId, and both breakfast and dinner pack configurations (brfastPackId, dinnerPackId, etc.).
  • Our web app then issues its own JWT (7-day expiry) so subsequent requests don't require re-authentication through the entire chain.

JWT-Based Session for Our Web App

We separate the FF21 auth (for placing orders against their backend) from our own app's auth. Users interact with our API using a JWT we issue:

const accessToken = jwt.sign(
  { phone: phone_number, name: userData.userName, avatar: userData.avatar },
  process.env.JWT_SECRET,
  { expiresIn: "7d" }
);

Explanation:

  • The JWT payload contains the user's phone number (used as the primary key), display name, and avatar URL.
  • 7d expiry keeps users logged in for a week — long enough to be convenient, short enough that stale sessions don't pile up.
  • Every subsequent API call to our /order/* routes passes through the verifyUser middleware, which validates this JWT and attaches req.user_phone.

The middleware in middleware/verifyUser.js:

const verifyUser = (req, res, next) => {
  const authHeader = req.headers.authorization || "";
  const accessToken = authHeader.startsWith("Bearer ")
    ? authHeader.slice(7)
    : authHeader;
 
  if (!accessToken || typeof accessToken !== "string") {
    return res.status(401).json({ error: "Unauthorized! Missing Access Token!" });
  }
 
  try {
    const user = jwt.verify(accessToken, process.env.JWT_SECRET);
    req.user_phone = user.phone;
    return next();
  } catch (e) {
    return res.status(403).json({ error: "Invalid Access Token!" });
  }
};

Explanation:

  • Standard Bearer token extraction from the Authorization header.
  • jwt.verify() handles both signature validation and expiry checks. If either fails, the user gets a 403.
  • req.user_phone is set for downstream route handlers — this is how we know which user's orders to fetch or create.

Core Ordering Logic

The heart of the entire system is orderFood.js — a standalone script that runs as a cron job. It fetches all pending orders from the database and executes them one by one against the FF21 API.

The Main Loop

async function orderForTomorrow() {
  await ensureLogDir();
 
  const now = new Date();
  const istFormatter = new Intl.DateTimeFormat("en-CA", {
    timeZone: "Asia/Kolkata",
    year: "numeric",
    month: "2-digit",
    day: "2-digit",
  });
  now.setDate(now.getDate() + 1);
  const istDateStr = istFormatter.format(now);
 
  let orders;
  try {
    orders = await prisma.order.findMany({
      where: {
        orderDate: `${istDateStr}T00:00:00.00Z`,
      },
      include: {
        user: {
          select: {
            brfastPackId: true, brNonvegType: true,
            brfastQrType: true, brfastQrTypeId: true,
            dinnerPackId: true, diNonvegType: true,
            dinnerQrType: true, dinnerQrTypeId: true,
          },
        },
      },
    });
  } catch (err) {
    await logError(`Failed to fetch orders for ${istDateStr}: ${err.message}`);
    return;
  }
  // ...
}

Explanation:

  • Intl.DateTimeFormat with en-CA locale gives us the date in YYYY-MM-DD format, explicitly in the Asia/Kolkata timezone. This is crucial — the server runs in UTC but FF21 operates on IST. Using en-CA is a neat trick to get an ISO-like date string without manual formatting.
  • now.setDate(now.getDate() + 1) calculates tomorrow's date since the script runs the evening before.
  • The Prisma query fetches all orders for that date and joins the user table to grab their meal pack configuration — breakfast and dinner pack IDs, veg/non-veg preferences, and QR code types.
  • If the database query itself fails, the function logs the error and exits immediately — no point trying to process zero orders.
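The en-CA trick is handy enough to isolate. This helper is a distilled version of the inline code above, returning any date as YYYY-MM-DD in IST:

```javascript
// en-CA formats dates as YYYY-MM-DD, so Intl yields an ISO-style date in
// any timezone with zero manual padding or offset arithmetic.
function istDateString(date = new Date()) {
  return new Intl.DateTimeFormat("en-CA", {
    timeZone: "Asia/Kolkata",
    year: "numeric",
    month: "2-digit",
    day: "2-digit",
  }).format(date);
}
```

Note the rollover: 20:00 UTC on June 1 is already 01:30 IST on June 2, so istDateString(new Date("2024-06-01T20:00:00Z")) returns "2024-06-02". That is precisely the boundary a UTC server has to get right.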

Processing Each Order

For every order, the script needs to:

  1. Get a fresh PHP session
  2. Build the correct form payload (breakfast vs. dinner)
  3. Submit the order to the FF21 server
  4. Clean up the order from our database

for (const order of orders) {
  try {
    const phpSessId = await getPhpSessId(order.authToken);
    if (!phpSessId) {
      await logError("Failed to fetch PHPSESSID for order " + order.id);
      errorCount++;
      continue;
    }
 
    const params = order.isDinner
      ? new URLSearchParams({
          pack_id: order.user.dinnerPackId,
          qr_type: order.user.dinnerQrType,
          nonveg_type: order.user.diNonvegType,
          qr_type_id: order.user.dinnerQrTypeId,
        })
      : new URLSearchParams({
          pack_id: order.user.brfastPackId,
          qr_type: order.user.brfastQrType,
          nonveg_type: order.user.brNonvegType,
          qr_type_id: order.user.brfastQrTypeId,
        });
 
    const placed = await placeFoodOrder(
      order.authToken, phpSessId, params.toString()
    );
 
    if (placed) {
      const removed = await removeOrder(order.id);
      if (removed) {
        successCount++;
      } else {
        await logError(`Order ${order.id} placed but failed to remove from database`);
        errorCount++;
      }
    }
  } catch (err) {
    await logError(`Error processing order ${order.id}: ${err.message}`);
    errorCount++;
  }
}

Explanation:

  • Sequential processing — orders are placed one at a time in a for...of loop rather than in parallel with Promise.all(). This was an intentional decision to avoid hammering the FF21 server with hundreds of simultaneous requests, which could trigger rate limiting or even IP bans.
  • A fresh PHPSESSID is obtained for each order by calling getPhpSessId() with the user's stored auth token. PHP sessions expire, so reusing a stale one would cause silent failures.
  • The isDinner boolean determines which pack configuration to use. The same user can have both breakfast and dinner orders scheduled — each with different pack IDs and preferences.
  • After a successful placement, the order is deleted from the database. This is fire-and-forget cleanup — the order served its purpose. If the delete fails, we log it but don't retry the order itself (that would cause a duplicate).

The Actual HTTP Request to Place an Order

The placeFoodOrder() function sends the order to the PHP backend as a form POST:

async function placeFoodOrder(authToken, phpSessId, params) {
  const url = `https://www.ff21.co.in/user_app/get_qr_service.php?authTokken=${authToken}`;
 
  const headers = {
    Cookie: `PHPSESSID=${phpSessId}`,
    Referer: `https://www.ff21.co.in/user_app/get_qr_service.php?authTokken=${authToken}`,
    "Cache-Control": "max-age=0",
    Origin: "https://www.ff21.co.in",
    "Content-Type": "application/x-www-form-urlencoded",
    "X-Requested-With": "com.ff21.community",
    // ... browser-mimicking headers
  };
 
  const res = await fetch(url, {
    method: "POST",
    headers,
    body: params,
  });
 
  if (!res.ok) {
    throw new Error(`HTTP ${res.status} - ${res.statusText}`);
  }
 
  return true;
}

Explanation:

  • The URL includes the authTokken as a query parameter — yes, their backend requires it both in the URL and validated via the PHP session cookie. Defense in depth, accidental or not.
  • The Referer header is set to the same page URL — without it, the server rejects the POST as a possible CSRF attempt.
  • The body is application/x-www-form-urlencoded — standard HTML form submission. The params contain pack_id, qr_type, nonveg_type, and qr_type_id.
  • We only check res.ok (HTTP 2xx status). The server doesn't return a structured JSON response for order placement — it returns an HTML page. A 200 means the order was accepted.

Scraping Meal Pack Configuration

One of the more interesting parts is how we extract the user's meal pack info. The food ordering page is a server-rendered HTML form, and we scrape it using Cheerio (controller/order/get_packInfo.js):

async function getPackInfo(guestId, phpSessId) {
  const randValue = Math.floor(10000000 + Math.random() * 90000000);
  const timestamp = Date.now();
 
  const url = `https://www.ff21.co.in/user_app/book_your_meal_tab.php?guest_id=${guestId}&rand${randValue}&_=${timestamp}`;
 
  const response = await fetch(url, { method: "GET", headers });
  const html = await response.text();
 
  const $ = cheerio.load(html);
  let formData = {};
 
  const form1 = $("#formsub_1");
  if (form1.length > 0) {
    formData = {
      brfastPackId: form1.find('input[name="pack_id"]').val() || "485",
      brNonvegType: form1.find('input[name="nonveg_type"]').val() || "0",
      brfastQrType: form1.find('input[name="qr_type"]').val() || "2",
      brfastQrTypeId: form1.find('input[name="qr_type_id"]').val() || "8",
    };
  }
 
  const form2 = $("#formsub_2");
  if (form2.length > 0) {
    formData.dinnerPackId = form2.find('input[name="pack_id"]').val() || "486";
    formData.dinnerQrType = form2.find('input[name="qr_type"]').val() || "3";
    formData.dinnerQrTypeId = form2.find('input[name="qr_type_id"]').val() || "7";
    formData.diNonvegType = form2.find('input[name="nonveg_type"]').val() || "0";
  }
 
  return formData;
}

Explanation:

  • The URL includes a random value and a timestamp as cache-busting query parameters — the real app does this too, and without them the server sometimes returns stale cached HTML.
  • Cheerio parses the HTML like jQuery on the server side. The food ordering page contains two hidden forms: #formsub_1 for breakfast and #formsub_2 for dinner.
  • Each form has hidden <input> fields with the user's pack configuration — pack_id, qr_type, nonveg_type, and qr_type_id. These are the exact values that need to be POST-ed to place an order.
  • The fallback values ("485", "486", etc.) are defaults observed in the intercepted traffic — if the HTML parsing fails, we fall back to the most common pack configuration rather than crashing.

Similarly, the guest ID is extracted by scraping a data-content-url attribute from the HTML (controller/order/get_guestId.js):

const html = await response.text();
const $ = cheerio.load(html);
const dataContentUrl = $("[data-content-url]").attr("data-content-url");
 
if (dataContentUrl) {
  const guestId = dataContentUrl.split("guest_id=")[1]?.split("&")[0];
  return guestId;
}

Explanation:

  • The get_qr_service.php page contains an element with a data-content-url attribute that holds a URL containing the guest_id parameter.
  • We extract it with a simple string split — no regex needed, just split on guest_id= and take everything before the next &.
  • The guest ID uniquely identifies the resident in the food ordering system and is needed for fetching their specific meal pack info.

Scheduling & Cron Jobs

Timing Strategy

The FF21 system has a hard cutoff: orders for tomorrow must be placed before 6 PM today. To handle this, the cron job runs twice a day:

  1. 2:00 PM IST — The first run processes the bulk of the orders early. This drastically reduces the queue size ahead of the deadline.
  2. 5:55 PM IST — The final sweep picks up any orders placed after 2 PM. Since most of the batch was already handled in the first run, this pass is fast — typically finishing in under a minute with only a handful of orders left.
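In crontab terms, the two runs look roughly like this. The server clock is UTC (IST is UTC+5:30), and the paths here are illustrative, not the actual deployment layout:

```shell
# 2:00 PM IST bulk run = 08:30 UTC; 5:55 PM IST final sweep = 12:25 UTC.
# Runs every day: weekend dates simply have no orders queued, so those
# runs are no-ops.
30 8  * * *  cd /home/ubuntu/ff21-food-bot && node orderFood.js >> logs/cron.log 2>&1
25 12 * * *  cd /home/ubuntu/ff21-food-bot && node orderFood.js >> logs/cron.log 2>&1
```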

The orderFood.js script is designed as a standalone executable — it runs, processes all pending orders, and exits:

// Run the main function
orderForTomorrow().catch(async (err) => {
  await logError(`Script execution failed: ${err.message}`);
});

Explanation:

  • This is invoked as node orderFood.js by the cron scheduler.
  • The .catch() at the top level ensures that even if the entire script crashes (uncaught promise rejection, database connection failure, etc.), we get a log entry. No silent failures.
  • The function resolves naturally after processing all orders — no explicit process.exit() needed.

Execution Logging

Every cron run produces a summary log with execution metrics:

const executionEnd = new Date();
const executionTime = executionEnd - executionStart;
 
await logSuccess(
  `Execution completed in ${executionTime}ms - Success: ${successCount}, Errors: ${errorCount}, Total: ${orders.length}`
);

Explanation:

  • We track wall-clock time from start of execution to completion.
  • The success/error counters give an at-a-glance view of each batch's health.
  • Log files are written to logs/order-success.log and logs/order-errors.log with ISO timestamps, making it easy to grep for issues.

Web App & User Onboarding

Architecture Overview

The backend is a Node.js + Express API server with Prisma ORM connected to a PostgreSQL database. The frontend (not in this repo) is a separate Vite/React SPA deployed at ffood.devakash.in.

The entry point (index.js) sets up the Express server with two route groups:

const app = express();
 
const allowedOrigins = ["http://localhost:5173", "https://ffood.devakash.in"];
 
app.use(
  cors({
    origin: (origin, callback) => {
      if (!origin || allowedOrigins.includes(origin)) {
        return callback(null, true);
      }
      callback(new Error("Not allowed by CORS"));
    },
    credentials: true,
  })
);
 
app.use(express.json());
app.use("/auth", authRouter);
app.use("/order", verifyUser, orderRouter);

Explanation:

  • CORS is whitelisted to only the local dev server (localhost:5173 — Vite's default port) and the production domain. The !origin check allows server-to-server requests (like the cron job) that don't send an Origin header.
  • /auth routes are public — they handle OTP send/verify before the user is authenticated.
  • /order routes are protected behind the verifyUser JWT middleware — every request must include a valid Bearer token.
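The !origin nuance is worth seeing in isolation. The whole CORS decision reduces to a pure predicate; a sketch using the same whitelist as above:

```javascript
// The CORS decision boiled down: requests with no Origin header (curl,
// server-to-server calls, the cron job) pass; browser requests must match
// the whitelist exactly.
function isAllowedOrigin(origin, allowedOrigins) {
  return !origin || allowedOrigins.includes(origin);
}

const allowedOrigins = ["http://localhost:5173", "https://ffood.devakash.in"];
```

So isAllowedOrigin(undefined, allowedOrigins) is true, while isAllowedOrigin("https://evil.example", allowedOrigins) is false.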

API Routes

Auth routes (routes/auth.js):

authRouter.post("/send-otp", sendOtp);
authRouter.post("/verify-otp", verifyOtp);

Order routes (routes/order.js):

orderRouter.get("/fetch", getOrders);
orderRouter.post("/place", placeOrder);
orderRouter.post("/remove", removeOrder);

Explanation:

  • The API surface is intentionally minimal — just 5 endpoints. Users can: authenticate (2 endpoints), view their scheduled orders, add a new order, or cancel an existing one.
  • The verifyUser middleware runs before any order route, so req.user_phone is always available inside the handlers.

User Registration Flow

When a user first signs up:

  1. Frontend → sends phone number to POST /auth/send-otp
  2. User → receives OTP on their phone
  3. Frontend → sends phone + OTP to POST /auth/verify-otp
  4. Backend → runs the full auth chain (OTP → access token → session → auth token → PHP session → guest ID → pack info)
  5. Backend → stores everything in PostgreSQL, returns a JWT to the frontend
  6. Frontend → stores the JWT and uses it for all subsequent API calls

Scheduling an Order

Users schedule orders through the web app by selecting dates and meal types. The place_order.js controller validates and creates the order:

const placeOrder = async (req, res) => {
  const phone_number = req.user_phone;
  const { orderDate, isDinner } = req.body;
 
  const parsedDate = new Date(orderDate);
 
  // Can't order food for weekends
  if (parsedDate.getDay() == 0 || parsedDate.getDay() == 6) {
    return res.status(400).json({ error: "Food can't be ordered for weekends!" });
  }
 
  // Check the 6 PM deadline for tomorrow's orders
  const now = new Date();
  const tomorrow = new Date(now);
  tomorrow.setDate(now.getDate() + 1);
  tomorrow.setHours(0, 0, 0, 0);
 
  if (now.getHours() >= 18 && parsedDate.getDate() == tomorrow.getDate()) {
    return res
      .status(400)
      .json({ error: "Food for tomorrow can only be ordered before 6PM!" });
  }
 
  // Must be for a future date
  if (parsedDate < tomorrow) {
    return res.status(400).json({ error: "Food order must be for tomorrow!" });
  }
 
  const user = await prisma.user.findUnique({ where: { phone: phone_number } });

  const dbRes = await prisma.order.create({
    data: {
      authToken: user.authToken,
      orderDate: new Date(orderDate),
      isDinner,
    },
  });
 
  return res.status(201).json({ orderId: dbRes.id, message: "Order placed" });
};

Explanation:

  • Weekend check: FF21 doesn't serve meals on Saturdays (6) and Sundays (0).
  • 6 PM deadline: If it's currently after 6 PM and the user is trying to order for tomorrow, we reject it — the automation wouldn't be able to place it in time.
  • Future date check: Users can't order for today or any past date.
  • The order is stored with the user's authToken (which links it to the User table via a foreign key) along with the date and meal type (isDinner: true for dinner, false for breakfast).
  • Note that the order isn't placed immediately — it's just saved to the database. The cron job picks it up during one of its two daily runs (2:00 PM and 5:55 PM).

Data Model

The Prisma schema (prisma/schema.prisma) is lean — just two tables:

model User {
  phone     String   @id
  avatar    String
  name      String
  authToken String   @unique
  sessionId String
  createdAt DateTime @default(now())
  updatedAt DateTime @default(now())
 
  guestId        String
  brfastPackId   String
  brNonvegType   String
  brfastQrType   String
  brfastQrTypeId String
 
  dinnerPackId   String
  diNonvegType   String
  dinnerQrType   String
  dinnerQrTypeId String
 
  Orders Order[]
}
 
model Order {
  id        String   @id @default(uuid())
  authToken String
  orderDate DateTime
  isDinner  Boolean
  user      User     @relation(fields: [authToken], references: [authToken])
}

Explanation:

  • User uses the phone number as the primary key — guaranteed unique across FF21 residents. The authToken is also marked @unique because orders reference it as a foreign key.
  • Each user stores both breakfast and dinner pack configurations — these are the hidden form field values scraped from the food ordering HTML page. Storing them means we don't need to re-scrape the page every time we place an order.
  • Order is a simple queue table. Each row represents a pending order — when the cron job runs, it reads all orders for tomorrow's date, places them, and deletes the rows.
  • The uuid() default for id gives each order a unique identifier that users can use to cancel specific orders.

Deployment & Infrastructure

| Component | Technology |
| --- | --- |
| Runtime | Node.js (Express 5) |
| Database | PostgreSQL |
| ORM | Prisma |
| Scheduling | OS-level cron job on an EC2 t2.micro instance |
| Frontend | Vite + React (separate repo), hosted on Vercel |

Scaling Challenges & Lessons

From 1 to ~700 Users

When I first built this, it was for myself — one user, one order. Scaling to ~700 brought several challenges:

Sequential Processing Was Intentional

I chose to process orders sequentially rather than in parallel. With ~700 orders per batch:

  • Parallel (naive): 700 simultaneous HTTP requests to ff21.co.in → almost certainly triggers rate limiting, IP blocks, or crashes their PHP server.
  • Sequential: Takes a few minutes, but every order goes through cleanly. At ~300ms per order (network round-trip + PHP processing), 700 orders take roughly 3.5 minutes. The first run at 2 PM handles the bulk, so the final 5:55 PM sweep only processes a small handful — finishing well before the 6 PM deadline.
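The production loop gets its pacing for free from network round-trips, but the pattern generalizes: if FF21 ever tightened its limits, an explicit pause between requests would be a two-line change. A sketch (processSequentially and the delay parameter are mine, not in the repo):

```javascript
// Generic sequential runner: one request in flight at a time, with an
// optional pause between jobs. With delayMs = 0 this is equivalent to the
// post's for...of loop; a nonzero delay would slow the batch further.
async function processSequentially(jobs, handler, delayMs = 0) {
  const results = [];
  for (const job of jobs) {
    results.push(await handler(job)); // next job starts only after this resolves
    if (delayMs > 0) await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  return results;
}
```

At ~300 ms per order, 700 jobs take roughly 700 × 0.3 s ≈ 3.5 minutes, matching the numbers above.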

Session Freshness

Each order gets a fresh PHPSESSID rather than reusing one. PHP sessions on the FF21 server have an unpredictable expiry window. Getting a fresh session per order adds ~200ms of overhead but eliminates an entire class of stale-session failures.

Reliability & Monitoring

Dual Log Files

The system writes to separate success and error log files with ISO timestamps:

async function logSuccess(message) {
  const timestamp = new Date().toISOString();
  const logEntry = `[${timestamp}] SUCCESS: ${message}\n`;
  await fs.appendFile(SUCCESS_LOG, logEntry);
}
 
async function logError(message) {
  const timestamp = new Date().toISOString();
  const logEntry = `[${timestamp}] ERROR: ${message}\n`;
  await fs.appendFile(ERROR_LOG, logEntry);
}

Explanation:

  • Splitting success and error logs means I can tail -f logs/order-errors.log to monitor failures without wading through hundreds of success lines.
  • ISO timestamps make log entries sortable and parseable by standard log analysis tools.

Error Isolation

Each order in the batch is wrapped in its own try-catch. One failing order never takes down the entire batch:

for (const order of orders) {
  try {
    // ... process order ...
  } catch (err) {
    await logError(`Error processing order ${order.id}: ${err.message}`);
    errorCount++;
  }
}

Explanation:

  • If user #347's auth token expired, they get logged as a failure, but users #348 through #700 still get their orders placed.
  • The final summary log tells you exactly how many succeeded vs. failed, making it easy to spot systemic issues (e.g., if the error count suddenly jumps from 2 to 200, the FF21 server might be down).

Edge Cases Handled

  • Weekends: The placeOrder controller rejects orders for Saturdays and Sundays since FF21 doesn't serve meals on weekends.
  • Past deadline: After 6 PM, users can't schedule orders for tomorrow (they've already missed the window).
  • Database cleanup: Successfully placed orders are deleted from the Order table. The table acts as a processing queue, not a permanent record.
  • Prisma error codes: The removeOrder controller specifically handles Prisma's P2025 error code (record not found) — which could happen if a user cancels an order right as the cron job is processing it.
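The P2025 handling in removeOrder isn't shown elsewhere in this post, so here is a sketch of the idea. The function shape and the deleteFn parameter are mine (the real controller calls prisma.order.delete directly); only the "treat P2025 as a benign race" behavior comes from the code:

```javascript
// Sketch of the P2025 guard: Prisma throws a known-request error with
// code "P2025" when the row is already gone. Here that means the user
// cancelled the order just as the cron job deleted it, which is fine.
// deleteFn stands in for prisma.order.delete so the logic reads on its own.
async function removeOrderSafe(deleteFn, orderId) {
  try {
    await deleteFn({ where: { id: orderId } });
    return true;
  } catch (err) {
    if (err && err.code === "P2025") {
      return true; // record not found: already removed, not a failure
    }
    throw err; // anything else (connection loss, etc.) is a real error
  }
}
```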

Conclusion

What started as "I wonder if I can automate my food order" became a full system managing daily meals for ~700 people. Here are the key technical takeaways:

  1. Reverse-engineering doesn't require magic — a proxy tool, a rooted emulator, and patience will get you surprisingly far. Most "closed" APIs are just undocumented, not truly locked down.

  2. Mimic the client exactly — the biggest source of failures was header mismatches. The Milo API wants okhttp/4.2.1 as the User-Agent. The PHP backend wants browser headers with X-Requested-With: com.ff21.community. Get these wrong and you get inexplicable 403s.

  3. Separate concerns ruthlessly — the auth chain (5 sequential API calls) runs once at signup. The ordering logic only needs the stored authToken and pack IDs. By caching everything at registration time, the critical path (the cron runs) is as fast as possible.

  4. Sequential > parallel when you don't control the server — bombarding someone else's PHP backend with 700 parallel requests is a great way to get IP-banned. Sequential processing with a comfortable time buffer is boring but reliable.

  5. Treat the order table as a queue — orders go in via the web app, get processed by the cron job, and are deleted after placement. Simple state machine: pending → placed → deleted.

  6. Log everything, separate success from failure — when something breaks at 5:55 PM and 700 people might not get dinner, you need to know exactly what happened within seconds.

The whole codebase is just ~15 files and ~800 lines of JavaScript. No frameworks beyond Express, no message queues, no microservices — just a well-structured Node.js app that does one job reliably. Sometimes the best architecture is the simplest one that works.