What to Expect at Every Stage of Enterprise App Development
By Mohan S | April 6, 2026
You're about to make a significant investment - in budget and in months of your team's time. You've approved the scope, selected a vendor, and now someone asks: "So what actually happens next?"
Most agencies won't give you a straight answer. They'll send you a methodology deck with circles and arrows and words like "iterative" and "agile." What they won't tell you is where projects actually go wrong - and it's almost never where you expect.
After nearly two decades working with organisations like Audi, Daimler, Scoot, UOB, and NEA, here's what should happen at every stage of enterprise app development, what the warning signs look like, and where the real risks hide.
The 6 Phases (And the Timeline Nobody Wants to Hear)
Discovery (1-2 wks) → Design (3-4 wks) → Development (8-16 wks) → Testing (2-4 wks) → Launch (1-2 wks) → Growth (ongoing)
Total timeline for a medium-complexity enterprise app: 4-7 months ideal. 6-9 months realistic.
That gap isn't padding. It's honesty. McKinsey's research on large IT projects found the average cost overrun reaches 45% above original budget, with the majority running over schedule. Knowing why is the first step to beating those odds.
Phase 1: Discovery (1-2 weeks)
This is the phase that separates expensive failures from successful products. Research consistently backs this up - projects that invest in thorough requirements discovery are 50% more likely to succeed. Yet it's the phase most buyers want to skip because it doesn't feel like "real work."
It is the most real work you'll do.
What happens:
Stakeholder alignment (Days 1-3)
- Who are the project sponsors - and do they agree on what success looks like?
- What business problem does this app solve?
- How will success be measured (specific KPIs, not vibes)?
- What are the hard constraints (budget, timeline, compliance)?
Here's the contrarian truth about stakeholder alignment: the people commissioning the app are almost never the people who'll use it. The executive sponsor has one vision. The operations lead has another. The IT team has constraints neither of them know about. Discovery is where you surface these conflicts - before they surface as change orders in month four.
User research (Days 3-7)
- Who are the actual users? (Not who you think - who actually will use this daily)
- What's their current workflow? (Often manual, often involving spreadsheets, often involving workarounds nobody documented)
- What are their pain points?
- What devices, connectivity, and context do they work in?
For one government port authority client, this single phase changed everything. User research revealed that workers needed to use the app outdoors in Singapore's heat, often wearing gloves. They couldn't tap small buttons. They couldn't read low-contrast text in direct sunlight. They couldn't type long form fields with gloves on. This wasn't an edge case - it was the primary use case. The entire UI was redesigned around larger touch targets, high contrast, and minimal text input. Without discovery, we'd have built a perfectly usable app that nobody could actually use.
Technical discovery (Days 5-10)
- What systems does the app need to integrate with? (And do those systems have documented APIs, or will you be reverse-engineering?)
- What data does it need to access?
- What are the security and compliance requirements?
- What's the existing technical landscape?
When Scoot engaged us for their cabin crew app, discovery took under two weeks - but the insights from that phase enabled us to deliver the full app in under two months, ultimately saving 31,500 man-hours per year. Samuel Chandra from Scoot put it plainly: "We work lean. No unnecessary positions, no long reports just to prove we did the work." Speed and rigour aren't opposites. A focused discovery makes everything downstream faster.
What you get:
- Project brief (aligned, written, signed off by all stakeholders)
- User personas based on actual research, not assumptions
- High-level feature map with priorities
- Technical architecture overview
- Refined cost estimate and timeline
The uncomfortable question:
"Is this the right thing to build?" Discovery should answer this honestly. Sometimes the answer is "not exactly" - and that saves you from building the wrong product entirely. For DB Schenker, discovery research revealed that the original smartwatch concept wouldn't work for their field teams. We pivoted to an Android wearable approach before a single line of code was written. That's a discovery success story, not a failure.
Phase 2: Design (3-4 weeks)
Design is where the app takes shape - before a single line of code is written. This phase is the cheapest place to be wrong. Moving wireframes around costs hours. Rewriting code costs weeks. Forrester research suggests every dollar invested in UX can return up to $100 down the line.
Week 1-2: UX Design
- User flows: Map every path through the app. Every button, every screen, every decision point.
- Wireframes: Low-fidelity layouts for all screens. Focus on structure and logic, not aesthetics.
- Information architecture: How content is organised and navigated.
- Prototype: Clickable wireframes that simulate the real app experience.
The contrarian observation here: most enterprise apps are over-designed for stakeholders and under-designed for users. The executive demo looks great. The person using it 200 times a day finds it slow and frustrating. Good design optimises for the 200-times-a-day user, not the boardroom presentation.
Week 2-3: User Testing
- Test the prototype with 5-8 real users
- Watch them attempt core tasks (not "explore the app" - specific tasks)
- Identify confusion points, dead ends, unnecessary steps
- Iterate the design based on findings
Week 3-4: Visual Design
- Design system: Colours, typography, spacing, components - all documented and reusable
- High-fidelity mockups: Pixel-perfect designs for all screens
- Interaction specifications: How things move, transition, and respond to user input
- Asset preparation: Icons, illustrations, images - export-ready for development
What you get:
- Complete UI/UX design files (Figma)
- Interactive prototype
- Design system documentation
- User testing report
- Design-to-dev handoff specs
The expensive assumption:
Skipping user testing. "We know our users" is the most expensive sentence in enterprise software. Nielsen Norman Group research proves it: five user tests reveal 85% of usability issues. Fifty stakeholder meetings won't find what five user tests will.
Phase 3: Development (8-16 weeks)
Now we build. With validated designs in hand, development is focused and efficient - not a guessing game.
Sprint structure (2-week cycles)
Each sprint:
- Sprint planning (what gets built this cycle)
- Development (building features)
- Internal QA (testing as we go)
- Sprint demo (show you working software)
- Sprint review (adjust priorities for next sprint)
Typical team for a medium enterprise app:
Project Manager - Timeline, scope, communication
UX/UI Designer - Design QA, refinements during build
Frontend Developer (iOS) - iOS app interface and logic
Frontend Developer (Android) - Android app interface and logic
Backend Developer - API, database, server logic
QA Engineer - Testing throughout development
What you should see:
- Bi-weekly demos of working software - not slide decks, not progress reports. You should be tapping through the app every 2 weeks.
- Transparent progress tracking: Access to the project management tool (Jira, Asana, etc.). See what's done, what's in progress, what's blocked.
- Regular communication: Weekly status updates at minimum. Instant communication for blockers.
- Change management: A clear process for scope changes - what it costs, what it delays, and whether it's worth it.
What to watch for:
- Silence is bad. If you don't hear from the team for 2 weeks, something is wrong. No exceptions.
- Demo delays are a warning. If the agency can't show working software every 2 weeks, the project is behind - regardless of what the status report says.
- Scope creep starts small. "Can we just add..." compounds quickly - PMI research found scope creep affects 52% of projects. Every addition should go through the change process.
Here's what most agencies won't say out loud: the biggest development risk isn't technical. It's your third-party integrations. That "simple API connection" to your SAP or Salesforce instance? It depends on their team's availability, their documentation quality, and their deployment schedule. We learned this building Mercedes-Benz SalesTouch - 12 backend systems across 11 markets. Budget integration contingency time, not because your agency is slow, but because systems you don't control have their own timeline.
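One practical defence against integration flakiness is to treat every external call as unreliable by default: timeouts, bounded retries, exponential backoff, and a loud failure when attempts run out. A minimal sketch of that pattern (the function and CRM endpoint here are hypothetical, not from any project mentioned above):

```python
import time

def call_with_retry(fn, max_attempts=3, base_delay=1.0, sleep=time.sleep):
    """Call an unreliable external integration with bounded retries.

    Backs off exponentially between attempts (1s, 2s, 4s, ...) and
    re-raises the last error once attempts are exhausted, so failures
    surface instead of silently stalling the workflow.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            sleep(base_delay * 2 ** attempt)

# Example: a flaky dependency that succeeds on the third try
calls = {"n": 0}
def flaky_crm_lookup():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("CRM timed out")
    return {"status": "ok"}

result = call_with_retry(flaky_crm_lookup, sleep=lambda s: None)
```

The wrapper doesn't make the third-party system reliable - nothing can - but it turns "their server hiccuped" from a user-facing crash into a transparent, budgeted delay.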
Phase 4: Testing (2-4 weeks)
Testing isn't a phase at the end - it happens continuously. But the final testing phase is structured, thorough, and non-negotiable.
Testing layers:
Unit testing
What it catches: Code-level bugs
Who does it: Developers (automated)
Integration testing
What it catches: API and system connection failures
Who does it: Developers + QA
Functional testing
What it catches: Features not working as designed
Who does it: QA team
UI/UX testing
What it catches: Design inconsistencies, interaction bugs
Who does it: Designer + QA
Performance testing
What it catches: Slow screens, memory issues, battery drain
Who does it: QA + DevOps
Security testing
What it catches: Vulnerabilities, data leaks
Who does it: Security team / external
UAT (User Acceptance Testing)
What it catches: "Does this actually work for our people?"
Who does it: Your team
Device testing
What it catches: App works across different phones, tablets, OS versions
Who does it: QA team
UAT: Your team's sign-off
This is when your actual users - not just stakeholders - test the app against real scenarios. Give them tasks, not features. "Submit a leave request for next Monday" - not "test the leave request module."
UAT best practices:
- Define test cases in advance (we provide a template)
- Use real data, not test data
- Include edge cases (what happens when the internet drops mid-submission?)
- Document issues with screenshots and steps to reproduce
- Prioritise ruthlessly: critical bugs block launch, minor issues go to V1.1
The contrarian observation: UAT almost always takes longer than planned because organisations underestimate how hard it is to get users to test systematically. Block their calendars. Make it mandatory. Treat it like a project milestone, not a favour you're asking.
For government and enterprise:
- Penetration testing (often required for government tenders)
- PDPA compliance review
- Accessibility testing (WCAG 2.1 standards)
- SingPass/CorpPass integration testing (allow 4-6 weeks for certification - this catches people off guard)
Phase 5: Launch (1-2 weeks)
Launch is a process, not an event. The agencies that treat it like flipping a switch are the ones whose apps crash on day one.
Pre-launch checklist:
- App Store / Play Store listings prepared (screenshots, descriptions, keywords)
- App Store review submitted (allow 1-3 days for Apple, 1-2 days for Google)
- Analytics and crash reporting configured
- Push notification infrastructure tested
- Backend scaling verified for launch traffic
- Support documentation / FAQ prepared
- Internal communication plan (email to users, training sessions)
- Rollback plan in case of critical issues
Launch strategy options:
Big bang - Best for: Consumer apps, marketing-driven launches
Phased rollout - Best for: Enterprise apps (start with one team/department, expand)
Soft launch - Best for: Apps needing real-world validation before full release
Beta group - Best for: Complex apps where early feedback prevents costly fixes
For most enterprise apps, phased rollout is the only responsible choice. Start with a pilot group (50-100 users), collect feedback for 2-4 weeks, fix issues, then expand. Experience consistently shows phased rollouts significantly reduce post-launch critical incidents.
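Technically, a phased rollout is usually implemented with deterministic percentage bucketing: hash each user ID into a bucket from 0-99 and enable the app (or a feature flag) for users below the current rollout percentage. Because the hash is stable, expanding from 10% to 50% only ever adds users - nobody gets access and then loses it. A sketch, with made-up user IDs:

```python
import hashlib

def in_rollout(user_id: str, percent: int) -> bool:
    """Deterministically assign user_id to a bucket 0-99.

    A user is in the rollout if their bucket falls below the current
    percentage. Stable hashing guarantees the pilot group is always a
    subset of every later, larger rollout group.
    """
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < percent

# Which of these (hypothetical) users are in a 10% pilot?
pilot = [u for u in ("u1001", "u1002", "u1003") if in_rollout(u, 10)]
```

The same mechanism drives the expansion schedule: bump the percentage after each feedback cycle, and roll back by lowering it - no redeploy required.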
Apple App Store specifics:
- Review can take 1-3 business days
- Enterprise distribution (internal apps) skips public review but requires an Apple Developer Enterprise account
- TestFlight can be used for beta testing with up to 10,000 users
Phase 6: Growth (Ongoing)
The app is live. This is where most agencies disappear and most projects stagnate.
First 30 days: Monitor
- Crash reports and error rates
- User adoption metrics (are people actually using it?)
- Performance benchmarks
- User feedback (in-app, support channels)
- Bug fixes (warranty period covers critical bugs)
First 90 days: Optimise
- Analyse usage data - what's used daily, what's ignored entirely
- Collect user feedback systematically
- Plan V1.1 updates based on real data, not stakeholder wishlists
- Optimise performance issues found in production
Ongoing: Maintain and evolve
- Monthly maintenance (OS updates, security patches, bug fixes)
- Quarterly feature updates based on business needs
- Annual UX review to keep the app current - see 5 Signs Your Enterprise App Needs a Redesign
- Analytics-driven roadmap for continuous improvement
This is where long-term partnerships create compounding value. Clients like First Luxury have worked with us across multiple projects over 5+ years - not because of contracts, but because an agency that knows your systems, your users, and your business context can move twice as fast as one starting from scratch.
The Timeline Reality Check
This table is the most important thing in this article. Print it. Show it to your CFO.
Discovery
Ideal: 1-2 weeks | Realistic: 2-3 weeks
Why it slips: Stakeholder availability
Design
Ideal: 3-4 weeks | Realistic: 4-6 weeks
Why it slips: Internal review cycles
Development
Ideal: 8-16 weeks | Realistic: 10-20 weeks
Why it slips: Scope changes, integration delays
Testing
Ideal: 2-4 weeks | Realistic: 3-5 weeks
Why it slips: Bug fix cycles, UAT scheduling
Launch
Ideal: 1-2 weeks | Realistic: 1-2 weeks
Why it slips: It rarely does - launch is usually on time
The #1 cause of delays is not your agency. It's internal decision-making.
This is the truth most agencies are too polite to say plainly. If approvals take 2 weeks instead of 2 days at each review point, a 5-month project becomes a 7-month project. The math is simple, and the pattern is consistent across projects: delays are most often caused by client-side decision-making, not the agency.
The agency can only move as fast as you do. The best thing you can do for your project timeline is assign a single decision-maker with authority - and protect their calendar.
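The arithmetic behind that 5-to-7-month slip is worth making concrete. Assume five approval gates (one per phase - the exact count is an assumption for illustration): stretching each approval from 2 working days to 10 (two weeks) adds 8 working days per gate, roughly two extra months in total.

```python
WORKING_DAYS_PER_MONTH = 20  # rough business-days-per-month convention

def total_delay_months(review_points: int, fast_days: float, slow_days: float) -> float:
    """Extra calendar time added when every approval gate slips."""
    extra_days = (slow_days - fast_days) * review_points
    return extra_days / WORKING_DAYS_PER_MONTH

# Five gates, approvals taking 10 working days (2 weeks) instead of 2
delay = total_delay_months(review_points=5, fast_days=2, slow_days=10)
print(f"A 5-month project becomes a {5 + delay:.0f}-month project")
```

Eight days per gate sounds survivable; forty days across the project is the two-month overrun the CFO will ask about.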
What Your Process Should Feel Like
Here's the test: at no point during the project should you feel surprised.
Not by costs. Not by delays. Not by what gets delivered. If discovery, design, development, testing, and launch each do their job, the app you launch should be recognisably the app you envisioned - refined by research, validated by testing, and built by people who understood the problem.
If your current process feels like a black box, that's not a process problem. That's a partner problem.