
Financial Reporting Dashboard
Automating manual financial reports and designing insightful dashboards
ORGANIZATION
Education Institute
TEAM
Design, Data, Development
DURATION
3 months
ROLE
UX Designer
Design Requirements, Information Architecture, Wireframing, UI Iteration, Design Reviews
Problem
The client’s current financial reports were manually produced, table-heavy, and difficult to scan. Leadership and finance teams struggled to quickly identify risks or required actions, and the system reporting team spent significant time compiling rather than interpreting data.
Users needed a better way to review and report financial data so they could monitor financial health and mitigate risks.
Background & Goals
The client engaged us to design automated dashboards and reporting tables in Oracle Fusion Data Intelligence (FDI).
We identified a few important goals:
Automate manual reporting processes
Standardize report structure across teams
Simplify workflows for finance users
Surface actionable metrics without overwhelming detail
There are three main user groups of the dashboards:
Executive Leaders: Needed quick, high-level indicators and exception visibility.
Finance Team: Needed the ability to drill into supporting details.
System Reporting Team: Needed maintainable, scalable designs aligned with Oracle FDI capabilities.
Outcome & Impact
This project required simplifying complex financial workflows, designing within enterprise constraints, and delivering high-impact solutions under an aggressive timeline:
Delivered 15 FDI financial dashboard designs in 6 weeks
Established a repeatable dashboard framework that supported rapid delivery without sacrificing clarity.
Balanced simplicity and functionality within system constraints, avoiding unnecessary customization.
Design Approach
I joined the project midstream, while it was behind schedule. To regain momentum, I focused on speed, alignment, and reducing rework.
Rapid Domain Immersion: Because I was not part of the initial requirements gathering, I moved quickly to close the knowledge gap:
Reviewed current-state reports and requirement recordings in detail
Extracted and prioritized core user needs
Researched key financial reporting concepts (GL & Planning) to ensure accuracy in how metrics were structured and presented
Tight Cross-Functional Cadence: To prevent further delays, I established a structured collaboration rhythm:
Daily alignment with the Design Manager to validate direction
Daily touchpoints with development and data teams to confirm feasibility within Oracle FDI and surface constraints early
Structured, High-Efficiency Design Reviews: I redesigned how we ran client reviews to maximize decision-making speed.
Before each session: Prepared targeted questions to guide discussion toward specific decisions
During reviews: Clarified the financial context and key takeaways upfront; walked through wireframes section by section; captured feedback live in Figma to reduce follow-up cycles
This approach enabled us to maintain a delivery pace of 2–3 reports per week despite complex data dependencies and system constraints.
Challenges & Decisions
Challenge 1: Aligning on Data Model Before Visual Design
During early design reviews, I noticed the client struggled to evaluate wireframes because they were unclear on how current-state reports would translate into Oracle FDI data structures. As a result, the discussions focused more on “where is this coming from?” than “is this the right way to present it?”
The Decision
I flagged this pattern to the Design Manager and development team and recommended we clarify with the client how current-state reports would map to future-state data structures.
We decided to change our design review structure. First, we aligned with the client on the data mapping and clarified any open questions. Once the client understood the data context, we presented the dashboard wireframes and asked for targeted design feedback.
Why It Mattered
Improved quality and specificity of client feedback
Accelerated sign-off by separating data debates from design decisions
Challenge 2: Show Everything vs. Show What Matters
Stakeholders frequently requested that multiple data points, metrics, and breakdowns be displayed on a single dashboard view. The instinct was to make reports as comprehensive as possible.
However, early drafts revealed that densely packed dashboards increased cognitive load and diluted focus. Instead of clarifying performance, they recreated the same problem as the manual reports — too much information, not enough prioritization.
The Decision
I introduced visual hierarchy and progressive disclosure to shift the focus from completeness to decision-making by:
Structuring dashboards with summary KPIs and high-level trends at the top
Surfacing variances, exceptions, and deltas before detailed breakdowns
Moving secondary metrics into drill-down views or supporting tables
Using layout and spacing intentionally to guide scanning behavior
Rather than asking, “What data should we include?” I reframed discussions to, “What decision should this view support?”
Why It Mattered
Reduced visual noise and improved scannability
Helped users identify key financial signals within seconds
Challenge 3: Designing for Security-Based Data Variations
User access to data varied based on security permissions. Depending on role, some users would see complete datasets while others would see partial or limited data.
This created a risk: dashboards that looked clear and meaningful in full-data scenarios could appear empty, misleading, or confusing when data was restricted. Additionally, stakeholders were hesitant to sign off on designs without seeing how real data across different roles would render in the system.
The Decision
I proactively identified key security scenarios and designed for edge cases rather than assuming full visibility.
This included:
Working with the development team to understand how data filtering would affect visual outputs
Structuring charts and KPIs to remain interpretable even with limited data
Aligning stakeholders on expected variations before finalizing layouts
Instead of treating security constraints as a downstream issue, I incorporated them directly into the design strategy.
Why It Mattered
Prevented confusion caused by incomplete data views
Increased confidence in dashboard behavior across user types
Ensured consistency and usability regardless of data visibility level
Wireframes
Reflection
What went well:
Strong cross-functional collaboration: By aligning on data models and development feasibility, we reduced rework and kept delivery moving despite aggressive timelines. This collaboration also improved the quality of design reviews, focusing discussions on business decisions rather than technical uncertainty.
What I wish I had done differently:
Visual consistency and design governance: Given the speed of delivery, maintaining consistency across dashboards was challenging. To mitigate this, we developed a lightweight guidance document for developers outlining layout rules and naming standards. With more time, however, I would formalize this into a scalable design system with reusable components and documented visualization standards.
Balancing brand, accessibility, and data visualization best practices: Designing within enterprise brand constraints while ensuring accessibility and effective data storytelling required constant tradeoffs. Some brand colors lacked sufficient contrast for accessible data visualization, and balancing aesthetic alignment with clarity was an ongoing tension. In the future, I would advocate for early alignment on a dedicated data visualization color palette that satisfies both brand and accessibility standards.