Hydrix Template

Data API Presets Tutorial

This site was initialized with the Data API presets template for connect-11.

This walkthrough covers the pk, pk+sk, query, and S3 presets, custom Lambda routes, backend job wiring, and group-protected operations through the Admin API.

Client Portal Walkthrough

Open Hydrix Client Portal, go to the Data Resources tab, select this site, and complete each phase in order.

Fast path: sign in at /admin/data-resources and click Connect demo resources. That button provisions all resources and route bindings used below.

Phase 1: Public DynamoDB put/get/query flow

  1. Go to Data Resources and pick this site in Site selection.
  2. Create these two DynamoDB resources in the Data resources form.

profileByPk

Partition key name "pk" with type "S". Leave sort key empty.

ordersByPkSk

Partition key "pk" and sort key "sk", both with type "S".
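For orientation, the stored items in later phases look roughly like this, assuming the put presets merge the key fields into the written item (the values match the client examples further down):

```javascript
// Illustrative item shapes for the two tables, assuming the put presets
// merge the key fields (pk / pk+sk) into the stored item.
const profileItem = {
  pk: 'user#tutorial',  // partition key for profileByPk
  displayName: 'Tutorial User',
  plan: 'starter',
};

const orderItem = {
  pk: 'user#tutorial',  // partition key for ordersByPkSk
  sk: 'order#1001',     // sort key for ordersByPkSk
  status: 'submitted',
  total: 42,
};
```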

In the Operations form, create these preset routes for the pk-only table:

ddb-put-by-pk

  • Auth: public
  • Target type: Preset
  • Preset: ddb.putByPk
  • Data resource: profileByPk
  • Transform return: raw
  • Payload fields: pk, item
  • Use default payload fields: "pk" for key and "item" for the object payload.

ddb-get-by-pk

  • Auth: public
  • Target type: Preset
  • Preset: ddb.getByPk
  • Data resource: profileByPk
  • Transform return: item
  • Payload fields: pk
  • Leave "Preset args JSON" empty to use the default payload field name "pk".

Then create these preset routes for the pk+sk table:

ddb-put-by-pk-sk

  • Auth: public
  • Target type: Preset
  • Preset: ddb.putByPkSk
  • Data resource: ordersByPkSk
  • Transform return: raw
  • Payload fields: pk, sk, item
  • Use default payload fields for pk, sk, and item.

ddb-get-by-pk-sk

  • Auth: public
  • Target type: Preset
  • Preset: ddb.getByPkSk
  • Data resource: ordersByPkSk
  • Transform return: item
  • Payload fields: pk, sk
  • Use default payload fields for pk and sk.

ddb-query-by-pk

  • Auth: public
  • Target type: Preset
  • Preset: ddb.queryByPk
  • Data resource: ordersByPkSk
  • Transform return: items
  • Payload fields: pk (optional: limit)
  • Use default payload fields for pk and optional limit.

Then run the Phase 1 checks:

  1. Open the Connection Check section and provide sample item JSON.
  2. Run ddb-put-by-pk to write data.
  3. Run ddb-get-by-pk to read that same data back.
  4. Run ddb-put-by-pk-sk with a sample pk + sk.
  5. Run ddb-get-by-pk-sk for that same pk + sk.
  6. Run ddb-query-by-pk to verify query results by pk.
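If you prefer raw fetch over the preset client, each operation is a POST to /data/<operation-name> with the fields wrapped under payload (the same envelope the group-protected example uses later in this tutorial). A minimal sketch; buildDataRouteRequest is an illustrative helper, not part of the Hydrix client:

```javascript
// Illustrative helper: build the request for a preset data route using
// the POST /data/<operation> + { payload } envelope.
function buildDataRouteRequest(operation, payload) {
  return {
    url: `/data/${operation}`,
    init: {
      method: 'POST',
      headers: { 'content-type': 'application/json' },
      body: JSON.stringify({ payload }),
    },
  };
}

// Example: the ddb-put-by-pk check from the steps above.
const putRequest = buildDataRouteRequest('ddb-put-by-pk', {
  pk: 'user#tutorial',
  item: { displayName: 'Tutorial User', plan: 'starter' },
});
// const response = await fetch(putRequest.url, putRequest.init);
```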

Phase 2: Public S3 upload/download flow

  1. Create S3 resource assetsBucket in the Data resources form.
  2. In Operations, create these two S3 preset routes.
  3. In Connection Check, choose a file and run the upload check.
  4. Then run the download check to confirm the uploaded object is reachable.

s3-presign-put-object

  • Auth: public
  • Target type: Preset
  • Preset: s3.presignPutObject
  • Data resource: assetsBucket
  • Transform return: raw
  • Payload fields: key (optional: contentType, expiresInSeconds)
  • Leave "Preset args JSON" empty to use default payload field names.

s3-presign-get-object

  • Auth: public
  • Target type: Preset
  • Preset: s3.presignGetObject
  • Data resource: assetsBucket
  • Transform return: raw
  • Payload fields: key (optional: expiresInSeconds)
  • Leave "Preset args JSON" empty to use default payload field names.

Phase 3: Custom lambda route + backend job

  1. Create Lambda resource tutorialDataLambda in Data resources.
  2. Set Lambda data sources to profileByPk, ordersByPkSk, and assetsBucket.
  3. In Operations, create lambda route demo-custom-route with auth public.
  4. Create backend job demo-custom-route-job bound to operation demo-custom-route.
  5. Run the custom route check and backend job trigger check from this page.

demo-custom-route

  • Auth: public
  • Target type: Lambda
  • Lambda resource: tutorialDataLambda
  • Payload fields: pk (optional), source (optional)
  • In the Operations form, select Target type "Lambda" and bind tutorialDataLambda.

Lambda Code For tutorialDataLambda

// Custom route handler: echoes payload fields and reports which bound
// data resources are visible to this lambda.
exports.handler = async (event = {}) => {
  const payload = event.payload && typeof event.payload === 'object' ? event.payload : {};
  const bindings = event.dataResourceBindings && typeof event.dataResourceBindings === 'object'
    ? event.dataResourceBindings
    : {};
  const byName = bindings.byName && typeof bindings.byName === 'object' ? bindings.byName : {};
  const hasTable = Boolean(byName['profileByPk']);
  const hasBucket = Boolean(byName['assetsBucket']);
  const hasOrdersTable = Boolean(byName['ordersByPkSk']);
  return {
    ok: true,
    data: {
      message: 'Demo lambda executed',
      source: typeof payload.source === 'string' ? payload.source : 'api-route',
      pk: typeof payload.pk === 'string' ? payload.pk : 'user#tutorial',
      resources: {
        profileByPk: hasTable,
        assetsBucket: hasBucket,
        ordersByPkSk: hasOrdersTable,
      },
      invokedAt: new Date().toISOString(),
    },
  };
};

Public Mode Client Example

Use this while following Phase 1 through Phase 3 with public routes.

import { createHydrixDataPresetClient } from './src/lib/data-api-presets';

const dataApi = createHydrixDataPresetClient({
  baseUrl: '',
  routeScope: 'public',
});

await dataApi.ddbPutByPk({
  pk: 'user#tutorial',
  item: { displayName: 'Tutorial User', plan: 'starter' },
});
const profile = await dataApi.ddbGetByPk({ pk: 'user#tutorial' });
await dataApi.ddbPutByPkSk({
  pk: 'user#tutorial',
  sk: 'order#1001',
  item: { status: 'submitted', total: 42 },
});
const order = await dataApi.ddbGetByPkSk({
  pk: 'user#tutorial',
  sk: 'order#1001',
});
const orders = await dataApi.ddbQueryByPk({ pk: 'user#tutorial', limit: 10 });

const upload = await dataApi.s3PresignPutObject({
  key: 'tutorial/check-connection.txt',
  contentType: 'text/plain',
});
await fetch(upload.url, {
  method: 'PUT',
  headers: { 'content-type': upload.contentType || 'text/plain' },
  body: new Blob(['Hydrix tutorial file'], { type: 'text/plain' }),
});
const download = await dataApi.s3PresignGetObject({ key: 'tutorial/check-connection.txt' });
const downloaded = await fetch(download.url);
const downloadedText = await downloaded.text();

const customRoute = await dataApi.invoke('demo-custom-route', {
  pk: 'user#tutorial',
  source: 'browser-check',
});

Phase 4: Move to authenticated routes

  1. Re-save each operation with Auth = authenticated in the Operations form.
  2. Keep the same operation names, preset IDs, resources, and transform settings.
  3. Update app calls to send a bearer access token.
  4. Use the Visitor Auth Sandbox panel below to sign up/sign in and generate test tokens.

How to get a data access token (tutorial)

This helper uses /data/auth/config and Cognito InitiateAuth with USER_PASSWORD_AUTH to get an access token for the sample checks.

// Phase 4 tutorial helper: fetch a data-system access token.
// Use this with test/tutorial credentials. For production, use your app's normal sign-in UX.
const authConfigResponse = await fetch('/data/auth/config', { cache: 'no-store' });
const authConfigEnvelope = await authConfigResponse.json();
if (!authConfigResponse.ok || !authConfigEnvelope?.ok) {
  throw new Error('Unable to load data auth config.');
}

const authConfig = authConfigEnvelope.data;
if (!authConfig?.enabled) {
  throw new Error('Data auth is not enabled yet for this site.');
}

const tokenResponse = await fetch('https://cognito-idp.' + authConfig.region + '.amazonaws.com/', {
  method: 'POST',
  headers: {
    'content-type': 'application/x-amz-json-1.1',
    'x-amz-target': 'AWSCognitoIdentityProviderService.InitiateAuth',
  },
  body: JSON.stringify({
    AuthFlow: 'USER_PASSWORD_AUTH',
    ClientId: authConfig.userPoolClientId,
    AuthParameters: {
      USERNAME: 'user@example.com',
      PASSWORD: '<password>',
    },
  }),
});

const tokenBody = await tokenResponse.json();
if (!tokenResponse.ok) {
  throw new Error(tokenBody?.message || 'Failed to acquire access token.');
}

const accessToken = tokenBody?.AuthenticationResult?.AccessToken;
if (!accessToken) {
  throw new Error('Access token missing in Cognito response.');
}
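Cognito's AuthenticationResult also includes ExpiresIn (token lifetime in seconds). A small sketch for turning that into an absolute expiry so you know when to re-run this helper; computeExpiry is illustrative, not part of the tutorial client:

```javascript
// Illustrative helper: convert Cognito's ExpiresIn (seconds) into an
// absolute Date so the client can refresh before the token lapses.
function computeExpiry(authenticationResult, nowMs = Date.now()) {
  const expiresIn = Number(authenticationResult && authenticationResult.ExpiresIn);
  if (!Number.isFinite(expiresIn) || expiresIn <= 0) return null;
  return new Date(nowMs + expiresIn * 1000);
}

// Example with a fixed "now" of the epoch for clarity:
const expiry = computeExpiry({ ExpiresIn: 3600 }, 0); // one hour after epoch
```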

// Use in the tutorial client:
// Authorization: 'Bearer ' + accessToken

Authenticated Mode Client Example

import { createHydrixDataPresetClient } from './src/lib/data-api-presets';

const accessToken = '<access-token-from-your-auth-flow>';
const dataApi = createHydrixDataPresetClient({
  baseUrl: '',
  routeScope: 'authenticated',
  headers: {
    Authorization: `Bearer ${accessToken}`,
  },
});

await dataApi.ddbPutByPk({
  pk: 'user#tutorial',
  item: { displayName: 'Tutorial User', plan: 'starter' },
});
const profile = await dataApi.ddbGetByPk({ pk: 'user#tutorial' });
await dataApi.ddbPutByPkSk({
  pk: 'user#tutorial',
  sk: 'order#1001',
  item: { status: 'submitted', total: 42 },
});
const order = await dataApi.ddbGetByPkSk({
  pk: 'user#tutorial',
  sk: 'order#1001',
});
const orders = await dataApi.ddbQueryByPk({ pk: 'user#tutorial', limit: 10 });
const customRoute = await dataApi.invoke('demo-custom-route', {
  pk: 'user#tutorial',
  source: 'authenticated-check',
});

Phase 5: Group-protected operation (separate flow)

  1. Create a separate Lambda resource tutorialGroupGuardLambda.
  2. Use this lambda code so only users in group tutorial-verified are allowed.
  3. Create operation demo-group-protected-route with auth authenticated and target that new lambda.
  4. Run the group-protected check button in Connection Check.
  5. If denied, add your data-system user to group tutorial-verified using POST /admin/data/groups/tutorial-verified/users, then rerun.

demo-group-protected-route

  • Auth: authenticated
  • Target type: Lambda
  • Lambda resource: tutorialGroupGuardLambda
  • Payload fields: pk (optional)
  • Keep this route authenticated and validate group "tutorial-verified" inside the lambda.

Group Guard Lambda Code

// Group guard handler: allows the call only when the caller's token
// includes the tutorial-verified group.
exports.handler = async (event = {}) => {
  const authContext = event.authContext && typeof event.authContext === 'object' && !Array.isArray(event.authContext)
    ? event.authContext
    : {};
  const groups = Array.isArray(authContext.groups)
    ? authContext.groups
      .filter((entry) => typeof entry === 'string')
      .map((entry) => entry.trim().toLowerCase())
      .filter(Boolean)
    : [];
  const requiredGroup = 'tutorial-verified';
  if (!groups.includes(requiredGroup)) {
    return {
      ok: false,
      error: {
        code: 'GroupRequired',
        message: `Operation requires Cognito group "${requiredGroup}".`,
      },
      data: {
        requiredGroup,
        groups,
      },
    };
  }
  return {
    ok: true,
    data: {
      message: 'Group-protected route allowed.',
      requiredGroup,
      groups,
      invokedAt: new Date().toISOString(),
    },
  };
};
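The membership check above reduces to a small predicate you can exercise locally before deploying; isGroupMember is an illustrative extraction, not part of the Hydrix API:

```javascript
// Illustrative helper: normalize token groups the same way the guard
// lambda does (strings only, trimmed, lowercased), then test membership.
function isGroupMember(authContext, requiredGroup) {
  const groups = Array.isArray(authContext && authContext.groups)
    ? authContext.groups
        .filter((entry) => typeof entry === 'string')
        .map((entry) => entry.trim().toLowerCase())
        .filter(Boolean)
    : [];
  return groups.includes(requiredGroup.toLowerCase());
}

const allowed = isGroupMember({ groups: [' Tutorial-Verified '] }, 'tutorial-verified'); // true
const denied = isGroupMember({ groups: ['other-group'] }, 'tutorial-verified');          // false
```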

Add/Remove Data User Group Membership (Admin API)

const adminToken = '<admin-id-or-access-token>';
const dataUsername = 'user@example.com';

// Add user to required data-system group:
await fetch('/admin/data/groups/tutorial-verified/users', {
  method: 'POST',
  headers: {
    'content-type': 'application/json',
    Authorization: `Bearer ${adminToken}`,
  },
  body: JSON.stringify({ username: dataUsername }),
});

// Remove user from required data-system group:
await fetch('/admin/data/groups/tutorial-verified/users', {
  method: 'DELETE',
  headers: {
    'content-type': 'application/json',
    Authorization: `Bearer ${adminToken}`,
  },
  body: JSON.stringify({ username: dataUsername }),
});

Group-Protected Client Call

const accessToken = '<data-system-access-token>';
const response = await fetch('/data/demo-group-protected-route', {
  method: 'POST',
  headers: {
    'content-type': 'application/json',
    Authorization: `Bearer ${accessToken}`,
  },
  body: JSON.stringify({
    payload: {
      pk: 'user#tutorial',
      source: 'group-check',
    },
  }),
});

const envelope = await response.json();
if (!response.ok) {
  throw new Error(envelope?.error?.message || 'Group route request failed.');
}
if (!envelope?.ok) {
  throw new Error(envelope?.error?.message || 'Group membership check failed.');
}

console.log('Group route result:', envelope.data);

Visitor Auth Sandbox

Use this section to sign up/sign in against the site visitor user pool, inspect token claims, copy an access token for testing, and add/remove the current visitor from the tutorial group.

Required group for Phase 5: tutorial-verified

Load visitor auth config, then sign up/sign in to test authenticated data routes.

Visitor Auth Config

Config not loaded yet.

Token Group Claims

Sign in to load a token.

No group claims in current token.

Current token does not include tutorial-verified.
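To inspect group claims yourself, decode the token's payload segment (base64url; no signature verification, so debugging only). Cognito access tokens carry groups in the cognito:groups claim. This Node sketch uses Buffer; in a browser, substitute an atob-based decoder:

```javascript
// Debug-only sketch: read the "cognito:groups" claim from a JWT payload
// without verifying the signature.
function decodeTokenGroups(token) {
  const segments = String(token).split('.');
  if (segments.length < 2) return [];
  const payloadJson = Buffer.from(segments[1], 'base64url').toString('utf8');
  const claims = JSON.parse(payloadJson);
  return Array.isArray(claims['cognito:groups']) ? claims['cognito:groups'] : [];
}

// Example with a fabricated, unsigned token:
const fakePayload = Buffer.from(
  JSON.stringify({ 'cognito:groups': ['tutorial-verified'] })
).toString('base64url');
const groups = decodeTokenGroups(`header.${fakePayload}.signature`); // ['tutorial-verified']
```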

Group add/remove uses /admin/data/groups/tutorial-verified/users and requires an admin token in the Connection Check section.

Admin Token Helper

Admin API actions in this tutorial require an admin-pool token. Sign in at /admin or /admin/data-resources with an admin/super-admin account, then load the token here.

  1. Sign in to the admin UI in another tab.
  2. Return to this page and click Load admin token from session.
  3. Run backend job trigger and group add/remove checks with that token.

Sign in at /admin (admin pool), then click "Load admin token from session".

Loaded Admin Token

  • Present: false
  • Source: (manual)
  • Expires: (unknown)
  • Username: (unknown)

Admin Group Claims

No groups found in current admin token.

Admin Job Trigger Example

Backend jobs are triggered through Admin API and require an admin-user-pool token.

const adminToken = '<admin-id-or-access-token>';
await fetch('/admin/data/jobs/demo-custom-route-job', {
  method: 'POST',
  headers: {
    'content-type': 'application/json',
    Authorization: `Bearer ${adminToken}`,
  },
  body: JSON.stringify({ source: 'manual-trigger' }),
});

Connection Check

Run each check individually, or run all core checks in order. Data route checks use your configured operations. The backend job trigger check and the visitor group add/remove actions use the Admin API and require an admin token. The group-protected route check is a separate Phase 5 flow.

Selected file: none

ddb-put-by-pk

Writes the sample JSON item to profileByPk using the sample pk.

ddb-get-by-pk

Reads profileByPk by the same pk and confirms the put output is reachable.

ddb-put-by-pk-sk

Writes sample JSON item to ordersByPkSk using sample pk + sk.

ddb-get-by-pk-sk

Reads ordersByPkSk by sample pk + sk and confirms write visibility.

ddb-query-by-pk

Queries ordersByPkSk by sample pk and confirms at least one returned item.

s3-presign-put-object + upload

Generates a PUT URL and uploads your selected file to the sample object key.

s3-presign-get-object + download

Generates a GET URL and downloads the object from the same sample key.

demo-custom-route

Invokes the custom Lambda-backed data route and validates the route response.

POST /admin/data/jobs/demo-custom-route-job

Calls admin API to trigger the backend job that runs against the custom route.

demo-group-protected-route

Calls the authenticated group-guard route; it only passes for users in group "tutorial-verified".

Click "Connect demo resources" in /admin/data-resources, then run the checks in order from this page.