What Bedstone Hub is
Bedstone Hub is the internal AI workspace your business actually uses: a private deployment at your own subdomain (ai.yourcompany.com.au, hub.yourcompany.com.au, or whatever fits your brand) that gives your team one chat-style interface to every business system you operate. CRM data, finance data, project status, ticket queues, internal documents, vendor records, customer history. All of it grounded in your own data, with access controls that mirror what each user can already see in the source systems.
It is a product, not a project. Built on tested patterns, deployed in two to four weeks for the first version, configured to your operations, and operated alongside your team. Your subdomain. Your data. Your model choice. Hosted on infrastructure you control.
Why we built it
Every business we work with has the same problem. The operational data lives in 10 to 30 different systems. The CRM knows the deals. The ticketing system knows the issues. The shared drive knows the contracts. The ERP knows the invoices. Nobody knows all of it at once because no one person can hold that many tabs open.
ChatGPT and Copilot are general assistants. They do not know your customers, your contracts, your operational history, or who has access to what. They are useful for general drafting. They are useless for "what is the status of the McKinnon contract" or "who in the team has touched the Carraway account this quarter".
Internal dashboards and analytics tools sort of help, but the team has to know which dashboard to open, what filter to apply, and what the column names mean. Most staff give up and ping an analyst on Slack. The analyst becomes the bottleneck.
Bedstone Hub removes that bottleneck. The interface is a chat box. The team types the question. The Hub looks at the right systems, grounded in the access the requesting user already has, and returns the answer with the source records linked. Non-technical staff get direct access to operational data without needing to learn the dashboard.
What it does
- Plain-language search across all your systems. "Show me every active customer in QLD that has not been contacted in 30 days." Pulled from CRM, filtered by recency, returned as a list with links to the records.
- Cross-system synthesis. "Summarise the McKinnon account: contract status, open tickets, last invoice, last touchpoint." Pulled from contracts, ticketing, ERP, and CRM. Returned as a single brief.
- Drafting grounded in your data. "Draft a renewal email for the McKinnon account referencing their recent ticket and last project." Drafted with the right context, no fabrication.
- Workflow triggers. "Create a follow-up task for me on this account next Monday." Hub creates the task in your project management or CRM system with the right metadata.
- Document Q&A. "What is our standard payment term in the master services agreement template?" Pulled from your contract library, with citations.
- Operational dashboards on demand. "Show me deals closing this month by owner." Pulled live, formatted as a table, with source links.
- Internal knowledge base. "What is our onboarding process for a new enterprise customer?" Pulled from internal wiki, with the relevant document linked.
- Audit trail. Every Hub query is logged. Your security and compliance team can see who asked what, when, and what data was returned.
Systems it connects to
The Hub is designed to connect to whatever your business actually runs. Common integrations:
- CRM. Salesforce, HubSpot, Pipedrive, Microsoft Dynamics, Zoho, custom CRMs.
- ERP and accounting. NetSuite, SAP, MYOB, Xero, QuickBooks, custom finance systems.
- Document store. SharePoint, Google Drive, Dropbox, Box, OneDrive, on-premise file shares.
- Wiki and docs. Notion, Confluence, GitBook, internal docs sites, Microsoft Loop.
- Ticketing and support. Zendesk, Intercom, Freshdesk, ServiceNow, Jira Service Management.
- Project management. Jira, Linear, Asana, Monday, ClickUp, custom project trackers.
- Communication. Slack, Microsoft Teams, email (with permission), shared inboxes.
- Code and development. GitHub, GitLab, Bitbucket, Azure DevOps.
- Data warehouse and analytics. Snowflake, BigQuery, Redshift, Databricks, Looker, Tableau, Power BI.
- Databases. Postgres, MySQL, SQL Server, MongoDB, with read-only scoped access.
- Object storage. S3, Azure Blob, GCS, with appropriate scoping.
- Custom systems. Internal applications and bespoke databases via REST, GraphQL, or direct database connection.
- On-premise sources. Connected via secure VPN, Direct Connect, or ExpressRoute where the data must stay inside your network.
If a system has an API, we connect to it. If it does not, we work out the right read pattern (scheduled sync, file-based, or supervised database access).
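The connector model described above can be sketched as a small read-only interface. This is an illustrative sketch, not the Hub's actual API: the `Connector` and `Record` names, and the CRM stub, are hypothetical.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class Record:
    source: str     # system the record came from, e.g. "crm"
    record_id: str  # stable identifier in the source system
    url: str        # deep link back to the source record
    body: str       # text content to be indexed for retrieval


class Connector(ABC):
    """One connector per source system; read-only and scoped."""

    @abstractmethod
    def fetch_changed(self, since_iso: str) -> list[Record]:
        """Return records changed since the given ISO timestamp."""


class CrmConnector(Connector):
    # Illustrative stub: a real connector would page through the CRM's API.
    def fetch_changed(self, since_iso: str) -> list[Record]:
        return [Record("crm", "acct-42", "https://crm.example/acct-42",
                       "Example account: renewal due next quarter")]


def sync(connectors: list[Connector], since_iso: str) -> list[Record]:
    """Pull changed records from every connected system for indexing."""
    changed: list[Record] = []
    for c in connectors:
        changed.extend(c.fetch_changed(since_iso))
    return changed
```

A scheduled sync or file-based source plugs in the same way: it just implements `fetch_changed` over whatever read pattern the system supports.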
How it works under the hood
Bedstone Hub is built on the production-proven patterns we have used in client engagements for the last three years. The architecture, in plain terms:
- Retrieval layer. Documents and structured data from your connected systems are indexed into a vector and keyword search layer. Hosted in your cloud account, in your region.
- Identity-aware access. Every request carries the requesting user's identity. Retrieval is scoped to records that user can access in the source system. We never return data the user could not access directly.
- LLM call. Question plus relevant retrieved context is sent to the LLM. Response is grounded in the retrieved data, with citations back to the source records.
- Action layer. For workflow actions (create task, send email, update record), the Hub uses scoped API access on the user's behalf. Every action is logged with reversibility where the source system supports it.
- Audit log. Every query, every action, every retrieved record is logged. Surfaced to your security and compliance team.
- Refresh layer. Source data refreshes on schedules appropriate to each system. CRM and ticketing every few minutes, document stores hourly, archived data daily.
- Observability. Latency, accuracy, citation rate, hallucination rate, and cost tracked per query. Your team sees the dashboards.
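The retrieval and LLM layers above can be sketched together. This is a minimal illustration of the pattern, not production code: the keyword match stands in for the vector-plus-keyword search layer, and all names here are hypothetical.

```python
from dataclasses import dataclass, field


@dataclass
class IndexedRecord:
    record_id: str
    source: str
    text: str
    allowed_users: set[str]  # mirrors the source system's permissions


@dataclass
class Answer:
    text: str
    citations: list[str] = field(default_factory=list)


def retrieve(index: list[IndexedRecord], user: str, query: str) -> list[IndexedRecord]:
    """Identity-aware retrieval: scope to records the user can already
    see, then match (a naive stand-in for vector + keyword search)."""
    visible = [r for r in index if user in r.allowed_users]
    terms = query.lower().split()
    return [r for r in visible if any(t in r.text.lower() for t in terms)]


def answer(index: list[IndexedRecord], user: str, query: str, llm=None) -> Answer:
    """Ground the LLM call in retrieved context; cite every record used."""
    ctx = retrieve(index, user, query)
    if not ctx:
        return Answer("No accessible records matched.", [])
    prompt = query + "\n\n" + "\n".join(r.text for r in ctx)
    text = llm(prompt) if llm else ctx[0].text  # stub when no model is wired in
    return Answer(text, [r.record_id for r in ctx])
```

The important property is that scoping happens at retrieval time, before anything reaches the model: a user with no access to a record can never have it appear in their context window, let alone their answer.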
Model and provider choice
The right LLM depends on your data sensitivity, compliance posture, and budget. Bedstone Hub supports:
- OpenAI (GPT-4o, GPT-5). Through OpenAI Business or Azure OpenAI with AU-region routing. Common default for general-purpose workloads.
- Anthropic Claude (Sonnet, Opus). Through Anthropic Business or AWS Bedrock with AU-region routing. Strong on long-context retrieval and reasoning over operational data.
- Google Gemini. Through Vertex AI in australia-southeast1. Useful for businesses already on Google Workspace.
- Open-weight models on your own infrastructure. Llama, Mistral, Qwen, and others via Ollama, vLLM, or AWS Bedrock with on-account deployment. For workloads that cannot send data to third-party providers.
- Mixed deployment. Sensitive workloads on open-weight models in your VPC, less sensitive workloads on commercial APIs. Routing logic tied to data classification.
We do not lock you into a model. The Hub is designed so model choice is configuration, not architecture. Switching providers takes hours, not weeks.
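The mixed-deployment routing described above reduces to a lookup keyed on data classification. A minimal sketch, with hypothetical provider and model names:

```python
# Hypothetical routing table: sensitive workloads stay on an in-VPC
# open-weight model; general workloads go to a commercial AU endpoint.
ROUTES = {
    "sensitive": {"provider": "vllm-in-vpc", "model": "llama-3-70b"},
    "general":   {"provider": "bedrock-ap-southeast-2", "model": "claude-sonnet"},
}


def route(classification: str) -> dict:
    """Pick a model endpoint from the data classification of the request.
    Unknown classifications fail closed to the in-VPC model."""
    return ROUTES.get(classification, ROUTES["sensitive"])
```

Because routing is a configuration table rather than code woven through the application, swapping a provider means editing one entry, which is what makes switching a matter of hours rather than weeks.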
Access control and identity
Access control is the most important thing the Hub gets right. A salesperson should see their pipeline. The CFO should see ledger data. HR should see employee records. None of them should see things their existing role does not permit.
- SSO integration. The Hub authenticates against your existing identity provider (Entra ID, Okta, Google Workspace, Auth0). No separate password.
- Role inheritance. Each user's role and permissions come from the source systems. The Hub respects them.
- Per-system scope. Where a user has limited access in a source system, the Hub queries with their scope, not a service account that sees everything.
- Document-level access. For document stores with per-file permissions, the Hub only returns documents the requesting user has read access to.
- Field-level redaction. Sensitive fields (salary, PII, financial details) can be redacted based on the requesting user's role.
- Audit logging. Every query, every retrieval, every action logged with user identity. Tied to your existing SIEM if you have one.
- Conditional access. MFA enforcement, device compliance, IP restrictions, all inherited from your identity provider's conditional access policies.
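Field-level redaction, the subtlest item in the list above, can be sketched as a per-role allow-list. The literal dict here is illustrative; a real deployment derives the policy from the identity provider:

```python
# Illustrative role policy: which fields each role may see.
ROLE_FIELDS = {
    "hr":    {"name", "email", "salary"},
    "sales": {"name", "email"},
}


def redact(record: dict, role: str) -> dict:
    """Return a copy of the record with fields outside the role's
    allow-list replaced by a redaction marker. Unknown roles see nothing."""
    allowed = ROLE_FIELDS.get(role, set())
    return {k: (v if k in allowed else "[REDACTED]") for k, v in record.items()}
```

Redaction is applied after retrieval but before the model sees the record, so a redacted field can never leak into a generated answer.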
How we deploy it
Bedstone Hub is not SaaS. We deploy it into your environment, on your terms.
- Your cloud account. AWS, Azure, or GCP, in the AU region of your choice. We deploy the infrastructure as code, document it, and hand back keys.
- Your subdomain. ai.yourcompany.com.au, or whatever fits your brand. DNS, TLS, and routing configured.
- Your data, your account. All data stored in your cloud account. No vendor lock-in on the data layer. Backups in your account.
- Your model accounts. Where commercial APIs are used (OpenAI, Anthropic, Google), the API accounts are yours. The Hub is configured to use them.
- On-premise variant. For organisations that cannot run in public cloud, the Hub deploys on-premise on Kubernetes or VM infrastructure. Same architecture, your hardware.
- IRAP / sovereign variant. For workloads requiring IRAP-assessed infrastructure or sovereign cloud, we deploy to the appropriate sovereign environment.
What week one looks like
Most Hub deployments run for two to four weeks for the first usable version. The first week sets up the rest.
- Day 1. Discovery and scope. Which systems first. Which team uses it first. What questions they need to answer. Access requirements mapped.
- Day 2. Identity integration. SSO configured against your identity provider. Test users provisioned. MFA enforcement validated.
- Day 3. First system connection. Pick the most valuable system (usually CRM or document store) and wire up the connector. Read-only, scoped, identity-aware.
- Day 4. First end-to-end query. Internal test users issue real queries against real data. Citations validated. Access scoping tested.
- Day 5. Pilot user onboarding. Small group of pilot users access the Hub at the agreed subdomain. Feedback captured. Refinements scheduled.
Bedstone Hub vs alternatives
ChatGPT Enterprise / Microsoft Copilot. General assistants with limited access to your specific business systems. Useful for drafting and general productivity. Cannot answer "what is the McKinnon contract status" because they cannot see it. Hosted by the vendor in their regions, not yours.
Glean, Notion AI Q&A, similar enterprise search. Better than ChatGPT for internal search across documents. Often limited on operational workflows (CRM, ERP, ticketing). Hosted SaaS with the vendor's data residency and access model.
Build it yourself. A six to twelve month engineering project, with the operational tail (model upgrades, retrieval tuning, access control maintenance) that comes with it. Worth it if you have a senior AI engineering team. Wasteful if you do not.
Per-system AI features (Salesforce Einstein, Microsoft Copilot for Dynamics, etc.). Each one only sees its own data. The salesperson still has to context-switch across five tools. The cross-system synthesis the team actually needs is missing.
Bedstone Hub. Deployed in your cloud account, in your region, at your subdomain. Connects every system, not just one vendor's stack. Identity-aware access from day one. Built and operated by senior engineers who can also build the custom systems it connects to. The right fit when you want a unified internal AI workspace without the build cost.
Common use cases
- Sales. "Show me at-risk accounts. Draft a renewal email for each. Summarise the last six months of activity on this account before my call." Pulled from CRM, communication, and document data.
- Customer success. "What is the open ticket queue for this customer. What is their contract renewal date. Have they been escalated in the last quarter." Pulled from ticketing, contracts, and account data.
- Operations. "What invoices are overdue more than 30 days. Which customers have the largest exposure. Generate the dunning email batch." Pulled from accounting and CRM.
- Finance. "Reconcile this bank statement against the ledger. Show me the unallocated payments. Draft the monthly board pack with the variance commentary." Pulled from accounting, banking, and historical board packs.
- Legal. "Find every contract with a termination clause shorter than 60 days. Pull the renewal dates. Flag the ones expiring this quarter." Pulled from document store with contract metadata.
- Engineering. "What is the on-call queue. What was the last incident. Find every place the legacy auth library is still used in the codebase." Pulled from PagerDuty, incident tooling, and code repositories.
- HR. "Find every employee due for a performance review this month. Pull the templates. Summarise the last review for each." Pulled from HRIS and document store, with HR-only access.
- Executive. "Brief me on the top three customer issues this week. Pull the revenue impact. Summarise the actions the team has taken." Pulled across CRM, ticketing, and finance.
Data residency and compliance
Data residency matters for AU businesses, and Bedstone Hub is designed for it.
- AU-region by default. Vector store, retrieval cache, application logs, audit trail all live in AU. No data crosses borders without explicit configuration.
- Model provider routing. Where commercial LLMs are used, we route to AU-region endpoints where available (Azure OpenAI Australia East, Bedrock ap-southeast-2, Vertex AI australia-southeast1). For workloads that cannot send data offshore, we deploy open-weight models in your VPC.
- Privacy Act and OAIC. Data handling aligned to APP obligations. Breach notification process documented.
- Sector frameworks. APRA CPS 234 for regulated financial entities, ACSC Essential Eight, sector-specific obligations. Configured per engagement.
- IRAP. Where the deployment supports government workloads, deployed to IRAP-assessed environments.
- SOC 2 and ISO 27001. Hub deployments produce the evidence trail your auditor expects.
- Data deletion. When a record is deleted in the source system, it is removed from the Hub retrieval index within minutes.
- No training on your data. Commercial LLM providers configured with the appropriate enterprise terms so your data is not used for model training.
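The deletion guarantee above comes down to pruning the retrieval index against the source system's current state on each sync. A minimal sketch, with hypothetical names:

```python
def prune_deleted(index: dict, live_ids: set) -> dict:
    """Drop index entries whose source record no longer exists.
    `index` maps record_id -> indexed text; `live_ids` is the current
    set of record ids reported by the source system's sync."""
    return {rid: text for rid, text in index.items() if rid in live_ids}
```

Run on the same schedule as the refresh layer, this keeps the window between a deletion upstream and its removal from retrieval down to minutes.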
How we structure engagements
Bedstone Hub is sold as a deployment-plus-operate engagement, not as a SaaS subscription. Common shapes:
- Foundation deployment. First three to five systems connected, single department use case, two to four week deployment. Fixed scope, fixed timeline.
- Broad rollout. Connect remaining systems and roll out across the business. Sequenced engagement over four to twelve weeks.
- Operate retainer. Monthly engagement to add new integrations, tune retrieval, update model configuration, respond to user feedback. Recommended for the first six months of usage.
- Infrastructure-only. For organisations with internal AI engineering capability, we deploy the infrastructure and hand over. Less common.
Bedstone Hub deployments commonly qualify for the R&D Tax Incentive given the novel integration work and AI engineering involved. Reach out and we will reply within 24 hours with the shape and pricing that fits your situation.
Common questions
What is Bedstone Hub?
Bedstone Hub is a private AI workspace deployed at your own subdomain (for example ai.yourcompany.com.au) that connects every business system you run, including your CRM, ERP, accounting, document store, ticketing, project management, and internal docs, into one chat-style interface backed by a large language model. Your team asks questions in plain language and gets answers grounded in your actual operational data.
How is it different from ChatGPT or Copilot?
ChatGPT and Copilot are general assistants with no access to your specific business data. Bedstone Hub is a private workspace grounded in your operational systems, with access controls that mirror your existing identity model, and hosted on infrastructure you control. It is the workspace your team uses for internal work, where the public chatbots cannot help.
Where does the data live?
By default, AU-region cloud (AWS ap-southeast-2, Azure Australia East, or GCP australia-southeast1). Your data does not leave Australia without explicit configuration. For IRAP or sovereign workloads we deploy to appropriate sovereign environments.
Which LLM does it use?
Configurable. Common deployments use OpenAI GPT, Anthropic Claude, or Google Gemini through their AU-region or business-tier endpoints. For workloads that cannot send data to third-party providers, we deploy open-weight models (Llama, Mistral, Qwen) on your own infrastructure via Ollama, vLLM, or Bedrock.
Which business systems can it connect to?
Salesforce, HubSpot, Pipedrive, Microsoft Dynamics, NetSuite, MYOB, Xero, QuickBooks, SAP, SharePoint, Google Workspace, Notion, Confluence, Jira, Linear, Asana, Monday, GitHub, GitLab, Slack, Microsoft Teams, Zendesk, Intercom, Snowflake, BigQuery, Redshift, Postgres, MySQL, S3, Azure Blob, custom REST and GraphQL APIs, and on-premise data sources via secure connectors.
Who can see what data?
Access controls inherit from your existing identity model. A salesperson sees the deals they own, not someone else's pipeline. Finance sees ledger data, not HR records. The Hub never surfaces data the requesting user could not access in the source system.
Does it work for non-technical staff?
Yes. The interface is a chat box. No query language, no dashboard navigation, no menus. The team types plain English questions and gets plain English answers with the underlying source records linked. The whole point is to give non-technical staff direct access to data that previously needed an analyst.
How long does deployment take?
First version live in two to four weeks for a focused scope (three to five systems connected, single department use case). Broader rollout across the business follows in four to twelve weeks depending on system count and access integration.