When a Pasifika community organisation opens a new piece of software for the first time, the disconnect is almost always immediate. The fields do not match how the organisation thinks about its work. The structure assumes individuals, not families. The language defaults to corporate English. The data lives on a server in Virginia or Oregon. And the entire system carries an unspoken assumption that every organisation operates the same way, regardless of culture, community, or context.
For Pasifika and Maori community organisations in Aotearoa New Zealand, this is not just an inconvenience. It is a fundamental barrier. The technology available to the community sector was designed by people who do not understand Pacific community work, and it shows in every field, every workflow, and every default setting.
Why Mainstream Tech Does Not Fit Pacific Community Work
Western software design assumes a specific model of service delivery: an individual client receives a service from an individual provider, and the transaction is documented in a case file. This model works reasonably well for clinical settings, corporate consulting, and government agencies. It does not work for Pacific community organisations where the unit of care is the family, not the individual.
In Samoan culture, the concept of aiga (family) extends far beyond the nuclear household. When a young person is referred to a mentoring programme, their aiga is part of the picture. Grandparents, aunties, uncles, older cousins, and church leaders all play a role in the young person's development. A mentoring platform that only has a field for "parent/guardian" misses the entire support network.
Similarly, Maori organisations working within a kaupapa Maori framework need technology that supports whanau centred practice. The whanau is the foundation of wellbeing, and any system that fragments the whanau into individual case files works against the cultural values that underpin the organisation's approach.
Cultural Safety in Software Design
Cultural safety is a concept that the health and social services sectors understand well but that the technology sector has largely ignored. In the context of software design, cultural safety means creating systems that respect and support the cultural practices of the people using them and the communities being served.
For Pasifika and Maori community organisations, cultural safety in software includes:
- Whanau and aiga centred data structures. The ability to record family relationships, extended family networks, and community connections as a core part of a young person's profile, not an afterthought.
- Te reo Maori and Pacific language support. Field labels, status options, and categories that can be displayed in te reo Maori or other Pacific languages. The software should not force everything into English.
- Cultural practices in session recording. When a mentoring session begins with a karakia or a prayer, the session log should have a natural place to record that. When a session involves whanau or community, the system should accommodate multiple participants without treating it as an exception.
- Respectful data handling. In Maori and Pacific cultures, personal information carries mana. The way data is collected, stored, and shared must reflect this. Broad data sharing defaults that are common in Western software can violate cultural expectations around information sovereignty.
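To make the first point concrete, here is a minimal sketch of what a whanau and aiga centred data model can look like: family relationships are first-class records attached to a young person's profile rather than a single "parent/guardian" field. All field and class names here are illustrative assumptions, not Poto AI's actual schema.

```python
from dataclasses import dataclass, field

# Hypothetical sketch only: relationships are core records on the
# profile, so the whole support network is captured and queryable.
@dataclass
class Relationship:
    name: str
    role: str          # e.g. "grandmother", "church leader", "older cousin"
    involvement: str   # e.g. "attends sessions", "emergency contact"

@dataclass
class Profile:
    name: str
    aiga: list[Relationship] = field(default_factory=list)

profile = Profile("Sione")
profile.aiga.append(Relationship("Mele", "grandmother", "attends sessions"))
profile.aiga.append(Relationship("Rev. Tavita", "church leader", "mentoring support"))

# The support network is data, not an afterthought: e.g. who attends sessions.
session_attendees = [r.name for r in profile.aiga
                     if r.involvement == "attends sessions"]
```

The design point is that a list of relationship records, unlike a single guardian field, lets the system record grandparents, aunties, uncles, cousins, and church leaders without treating any of them as an exception.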
The Story Behind Poto AI
Poto AI was not created in a Silicon Valley incubator or a corporate innovation lab. It was born from direct experience in the Pacific community sector. The platform is named in honour of Potoaʻe Satiu Kaho, and its development was driven by a simple observation: the tools available to community mentoring organisations were failing them.
Gus Gale, the founder of Poto AI and Ask Yr Grandpa, saw firsthand how programme coordinators at Pacific community organisations spent more time fighting their software than using it. They were copying data between spreadsheets, reformatting reports for different funders, and maintaining compliance records in systems that had no concept of Police Vetting, Oranga Tamariki, or MSD Youth Service contracts.
The vision for Poto AI was to build technology that understood Pacific community work from the inside. Not a generic platform with Pacific branding, but a system designed around the actual workflows, compliance requirements, and cultural values of community mentoring organisations in Aotearoa New Zealand and Australia.
Pacific Values Embedded in the Platform
Poto AI is built on a foundation of Pacific values. These are not marketing slogans. They are design principles that shape how the platform works.
Alofa (Love, Compassion)
Every feature is designed to reduce burden on frontline workers, so they can spend more time with the people they serve. AI handles the administration; humans handle the relationships.
Aiga (Family)
The platform supports whanau and aiga centred practice with family relationship mapping, multi participant sessions, and holistic views that show the young person within their community context.
Va (Relational Space)
The va between mentor and mentee, between organisation and community, between service provider and funder, is respected in how data flows through the system. Relationships are central, not transactional.
Tautua (Service)
The platform exists to serve community organisations, not to extract value from them. Pricing is designed for NFP budgets. Features are driven by what organisations actually need, not what generates the most revenue.
NZ Compliance That Matters
Pasifika and Maori community organisations in New Zealand operate within a specific regulatory environment, and compliance is not optional. The organisations that serve the most vulnerable communities are often subject to the most rigorous oversight. Purpose built technology should make compliance easier, not harder.
Poto AI addresses the compliance requirements that matter most for community organisations in Aotearoa:
- Privacy Act 2020. Data access controls, consent tracking, and audit trails that meet the requirements of New Zealand's privacy legislation. Staff see only the data they need for their role.
- Oranga Tamariki. Incident recording and reporting workflows that align with the requirements for organisations working with children and young people under Oranga Tamariki oversight.
- MSD Youth Service. Outcome frameworks built into session logging and progress tracking, so reporting against MSD contract requirements is automatic rather than manual.
- Police Vetting. Automated tracking of vetting status and expiry dates for all staff and mentors, with reminders sent before certificates expire.
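The Police Vetting point can be sketched in a few lines: given each person's certificate expiry date, flag anyone whose vetting expires within a reminder window. The window length, names, and function are illustrative assumptions, not Poto AI's implementation.

```python
from datetime import date, timedelta

# Illustrative lead time before expiry at which a reminder is sent.
REMINDER_WINDOW = timedelta(days=60)

# Hypothetical vetting records: staff member -> certificate expiry date.
vetting_records = {
    "Ana": date(2025, 3, 1),
    "Tomasi": date(2026, 1, 15),
}

def due_for_renewal(records, today, window=REMINDER_WINDOW):
    """Return staff whose vetting has expired or expires within the window."""
    return sorted(name for name, expiry in records.items()
                  if expiry - today <= window)

print(due_for_renewal(vetting_records, date(2025, 1, 20)))  # ['Ana']
```

Running this check on a schedule, rather than relying on someone remembering a spreadsheet, is what turns vetting compliance from a manual chore into an automatic safeguard.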
Data Sovereignty: Where Your Data Lives
For Pasifika and Maori organisations, data sovereignty is not an abstract concept. It is a practical concern with cultural dimensions. When an organisation's data about young people, families, and communities sits on servers in the United States, it is subject to US laws, US government access provisions, and US corporate policies. This is not acceptable for many Pacific community organisations.
Poto AI's data is hosted on Australian servers, within the jurisdiction of Australian and New Zealand privacy law. The data never crosses to US servers. The organisation retains full ownership of its data. And the platform's access controls ensure that only authorised staff can see sensitive information.
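The access-control idea, that only authorised staff can see sensitive information, is typically implemented as role-based permissions. The sketch below is a generic illustration under assumed role and record names, not a description of Poto AI's actual roles.

```python
# Hypothetical role-based access control: each role maps to the record
# types it may view. Role and record-type names are illustrative only.
ROLE_PERMISSIONS = {
    "mentor": {"session_notes"},
    "coordinator": {"session_notes", "whanau_contacts", "incident_reports"},
    "finance": {"funding_reports"},
}

def can_view(role: str, record_type: str) -> bool:
    """True if the given role is permitted to see this record type."""
    return record_type in ROLE_PERMISSIONS.get(role, set())

# A mentor sees session notes but not incident reports; unknown roles see nothing.
mentor_sees_notes = can_view("mentor", "session_notes")
mentor_sees_incidents = can_view("mentor", "incident_reports")
```

The cultural point carries through: because personal information carries mana, the default is deny, and access is granted per role rather than shared broadly.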
Data sovereignty in practice: When a community organisation collects information about a young person and their whanau, that data carries responsibility. It should be stored in a jurisdiction that the organisation trusts, managed under privacy laws that the organisation understands, and accessible only to the people who have a legitimate reason to see it. These are not technical requirements. They are cultural obligations.
Building Technology That Serves Community
The Pacific community sector in Aotearoa does extraordinary work. Mentoring programmes transform the trajectories of young people. Cultural programmes strengthen identity and belonging. Community organisations fill gaps that government services cannot reach. This work deserves technology that honours it.
That does not mean technology with Pacific patterns on the login screen and Western logic underneath. It means technology built by people who understand Pacific community work, designed around Pacific values, and shaped by the actual needs of the organisations doing the mahi. It means software where whanau centred practice is the default, not a workaround. Where cultural safety is built into the data structures, not bolted on as a feature. Where compliance with NZ legislation is automatic, not an afterthought.
Poto AI is that technology. Built for Pasifika and Maori community organisations, designed around the values that drive their work, and committed to making the administrative burden lighter so that every possible hour goes back to the people and communities that need it most.
About Poto AI: Poto AI is an AI powered programme management platform built for community mentoring organisations in New Zealand and Australia. Named in honour of Potoaʻe Satiu Kaho, it brings 36 features and 11 AI tools together in one system, with Pacific values at its core. Learn more at poto-ai.com