GraphQL Clients

In my last post (1), we set up a GraphQL server with Hot Chocolate; in this post, I will show how we can call that server from various clients. First, we make calls from a C# app with Strawberry Shake, a client provided by the Hot Chocolate team; then we will make calls from a React web app with the two popular GraphQL clients Apollo and Relay. When using these clients, it is important to remember that they are just making HTTPS calls (or whatever other transport you decide to use) behind the scenes; the clients are simply wrappers that provide extra tooling to make your life easier. The server we are building against can be found on GitHub (2).

Strawberry Shake

Writing a Strawberry Shake client against .NET 5+ is very easy; a completed demo can be found at (3). First, we need to install the Strawberry Shake dotnet tools by running dotnet tool install StrawberryShake.Tools --local on the command line. Now we will create our client with dotnet graphql init https://localhost:44377/graphql/ -n LibraryClient -p ./Client. Add a namespace property in the created .graphqlrc.json file under the extensions:strawberryShake section, alongside the url property; this is the namespace the generated client will be placed under. See (4) for more options.
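For reference, the resulting .graphqlrc.json might look something like this (the exact properties are whatever dotnet graphql init emitted for you; the namespace value shown here, LibraryClient.GraphQL, is just an example):

```json
{
  "schema": "schema.graphql",
  "documents": "**/*.graphql",
  "extensions": {
    "strawberryShake": {
      "name": "LibraryClient",
      "namespace": "LibraryClient.GraphQL",
      "url": "https://localhost:44377/graphql/"
    }
  }
}
```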

Next, we need to define some queries and mutations for our app to run; place these in various .graphql files inside the newly-created Client folder. When the project is built, these files will be found and compiled into your library using source generators; there will also be a number of files created under a Generated folder to support your editor experience.

fragment BooksPage on AllBooksConnection {
  nodes {
    id
    isbn
    name
  }
}

query App {
  allBooks {
    ...BooksPage
  }
}

mutation CreateUser($username: String!) {
  createUser(name: $username) {
    id
    name
  }
}

Now that we have our client, we can tie into it through DI. Here is an example of using it with a console app; if it were an ASP.NET Core or Blazor website, you would move the IoC configuration to the ConfigureServices method and inject the ILibraryClient wherever you needed to use it.

static async Task Main(string[] args)
{
    var serviceCollection = new ServiceCollection();

    serviceCollection.AddScoped(sp => new HttpClient { BaseAddress = new Uri("https://localhost:44377") });

    serviceCollection
        .AddLibraryClient()
        .ConfigureHttpClient(client => client.BaseAddress = new Uri("https://localhost:44377/graphql"));

    var serviceProvider = serviceCollection.BuildServiceProvider();
    var client = serviceProvider.GetRequiredService<ILibraryClient>();

    var result = await client.App.ExecuteAsync();
    var data = result.Data;

    // note: MockedField is not part of the schema; it is assumed to be a custom
    // property added through the generated types' partial class extension point
    data.AllBooks.Nodes[0].MockedField = "test";
    var mockedData = data.AllBooks.Nodes[0].MockedField;

    var createUserResult = await client.CreateUser.ExecuteAsync("abcd");
    var createdUser = createUserResult.Data;
}

Testing

Because Strawberry Shake exposes a partial interface for every generated object, we can easily mock the client wherever we inject it for use in tests; here is an example mock of our AppQuery call.

[Test]
public void MockAppQuery()
{
    var mockResult = new Mock<IOperationResult<IAppResult>>();
    mockResult.Setup(s => s.Data).Returns(new AppResult(
        new App_AllBooks_AllBooksConnection(new List<App_AllBooks_Nodes_Book>
        {
            new App_AllBooks_Nodes_Book(Guid.NewGuid(), "978-1617294532", "C# In Depth, Fourth Edition")
        })
    ));

    var mockAppQuery = new Mock<IAppQuery>();
    mockAppQuery.Setup(s => s.ExecuteAsync(It.IsAny<CancellationToken>()))
        .ReturnsAsync(mockResult.Object);

    var mockClient = new Mock<ILibraryClient>();
    mockClient.Setup(s => s.App).Returns(mockAppQuery.Object);

    // todo: act

    // todo: assert
}

When to Use

Use this client if you are calling a GraphQL API from C#; this could be a backend that calls Shopify, GitHub, or another GraphQL server, or a Blazor website or a WPF, Xamarin, or .NET MAUI app.

Apollo

Apollo is the most popular GraphQL client for JS-based apps because of its ease of use; to achieve this, however, it does not enforce some of the ideals of GraphQL. These are discussed in more detail in the When to Use section. A completed demo based on the npx create-react-app library --template typescript template can be found at (5). To use this client, we first need to install the latest @apollo/client and graphql packages with your package manager of choice; I use npm, since I am most familiar with it: npm i @apollo/client graphql. Next, we need to create an ApolloClient instance; this client tells our app where the GraphQL server lives, how to talk to it, and how to cache data. Put this in src/client.ts. (This step, and many of the following steps, are demonstrated in the Apollo get-started docs at (6).)

import {
  ApolloClient,
  ApolloLink,
  HttpLink,
  InMemoryCache,
} from '@apollo/client'
import env from './env'

const httpLink = new HttpLink({
  uri: env.GraphQLEndpoint,
})

export const client = new ApolloClient({
  cache: new InMemoryCache(),
  link: ApolloLink.from([httpLink]),
})

Example environment variable configuration, which I like to place at src/env.ts:

const env = {
  GraphQLEndpoint: process.env.GRAPHQL_ENDPOINT || 'https://localhost:44377/graphql/'
}

export default env

Next, we will configure our GraphQL codegen system. Because we are using TypeScript and React, we would like our queries and mutations to be strongly typed and to use hooks to perform our calls. While Apollo has a codegen system you can import, it has many bugs and I have not been able to get it working satisfactorily; I prefer the @graphql-codegen library. To use this library, we will install our packages with npm i --save-dev @graphql-codegen/cli @graphql-codegen/typescript @graphql-codegen/typescript-react-apollo @graphql-codegen/typescript-operations. Next, we will copy our schema, which can typically be obtained through the server's introspection support, to data/schema.graphql and create a codegen.yml file at the root of the project:

schema: ./data/schema.graphql
documents: 'src/**/*.tsx'
generates:
  src/types-and-hooks.ts:
    plugins:
      - typescript
      - typescript-operations
      - typescript-react-apollo

Note that we tell it where our schema is (this could also be a URL pointing to a live server, which is useful when the schema is in active development), which documents to scan for scripts (you can also point it to .graphql files, if you would rather not have your queries and mutations in your .tsx files), where to put the generated code, and which plugins to use when generating it. Once we add "graphql:codegen": "graphql-codegen" to the scripts section of the package.json and run it, we will be able to write queries and use them with hooks in our React components. The best part about this library is that if we were using Angular instead, for example, we could have used the typescript-apollo-angular plugin and nothing else around integrating with GraphQL would change other than the final usage (e.g. we would access it with dependency injection instead of hooks). Additional configuration options can be found at (7).
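For clarity, the scripts entry looks like this (alongside whatever scripts create-react-app already added for you):

```json
"scripts": {
  "graphql:codegen": "graphql-codegen"
}
```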

The next couple paragraphs are mostly React-focused; Apollo is not limited to working with React, so you can ignore the React-specific pieces if you are using a different frontend library.

Now we configure our ApolloProvider with an instance of our client in our App file. With everything set up, we can use the generated hooks in our React components.

App.tsx
import { ApolloProvider, gql } from '@apollo/client'
import { useAppQuery } from './types-and-hooks'
import { client } from './client'
import BooksPage from './BooksPage'
import CreateUser from './CreateUser'

gql`
  query App {
    allBooks {
      ...BooksPage
    }
  }
`

// exported so we can access it for testing
export function App() {
  const { data, loading, error } = useAppQuery()

  if (error) {
    return <div>Error!</div>
  }

  if (loading) {
    return <div>Loading...</div>
  }

  return (
    <div className="App container">
      <CreateUser />
      <BooksPage query={data?.allBooks} />
    </div>
  )
}

function AppRoot() {
  return (
    <ApolloProvider client={client}>
      <App />
    </ApolloProvider>
  )
}

export default AppRoot
BooksPage.tsx
import { gql } from '@apollo/client'
import { BooksPageFragment } from './types-and-hooks'

gql`
  fragment BooksPage on AllBooksConnection {
    nodes {
      id
      isbn
      name
    }
  }
`

interface Props {
  query: BooksPageFragment | null | undefined
}

function BooksPage({ query }: Props) {
  return (
    <div className="BooksPage">
      {query?.nodes?.map((m) => (
        <div key={m.id}>
          {m.name} - {m.isbn}
        </div>
      ))}
    </div>
  )
}

export default BooksPage
CreateUser.tsx
import { gql } from '@apollo/client'
import { useCreateUserMutation } from './types-and-hooks'

gql`
  mutation CreateUser($username: String!) {
    createUser(name: $username) {
      id
      name
    }
  }
`

function CreateUser() {
  const [command] = useCreateUserMutation()

  const createUser = () =>
    command({
      variables: {
        username: 'asdf',
      },
    })

  return (
    <div className="CreateUser">
      <button style={{ float: 'right' }} onClick={createUser}>
        Create User
      </button>
    </div>
  )
}

export default CreateUser

Note that the final solution I shared differs slightly so that it runs against the final version of the server, after we update it at the end of this post to follow Relay conventions (Relay being the GraphQL gold-standard client). The way I wrote it here reflects the state of the server at the end of my previous post (1).

Testing

Apollo can be tested very easily. First, we need to create an array of mock responses; each mock specifies which query it is for and what data is returned. We then pass these mocks to Apollo's MockedProvider. If we do not trigger our UI updates to process with an await act call, the component is in the loading state; after that, it either sets the data or the error, depending on what our mock returned. If we were not using fragments, we could set addTypename={false} on the MockedProvider and leave out the __typename fields in our mocks to make things simpler.

import { AppDocument, BooksPageFragment } from './types-and-hooks'
import { App } from './App'
import { MockedProvider, MockedResponse } from '@apollo/client/testing'
import { screen, render, act } from '@testing-library/react'

const mocks: MockedResponse[] = [
  {
    request: {
      query: AppDocument,
    },
    result: {
      data: {
        allBooks: {
          __typename: 'AllBooksConnection',
          nodes: [
            {
              __typename: 'Book',
              id: 1,
              isbn: '978-1617294532',
              name: 'C# In Depth, Fourth Edition',
            },
            {
              __typename: 'Book',
              id: 2,
              isbn: '978-1617295683',
              name: 'GraphQL in Action',
            },
          ] as BooksPageFragment['nodes'],
        },
      },
    },
  },
]

const errMocks: MockedResponse[] = [
  {
    request: {
      query: AppDocument,
    },
    error: new Error('An error occurred'),
  },
]

it('renders loading state', () => {
  render(
    <MockedProvider mocks={mocks}>
      <App />
    </MockedProvider>,
  )

  const domPiece = screen.getByText('Loading...')
  expect(domPiece).toBeInTheDocument()
})

it('renders book list', async () => {
  render(
    <MockedProvider mocks={mocks}>
      <App />
    </MockedProvider>,
  )

  await act(async () => await new Promise((resolve) => setTimeout(resolve, 0)))

  const domPiece = screen.getByText('C# In Depth, Fourth Edition - 978-1617294532')
  expect(domPiece).toBeInTheDocument()
})

it('renders error state', async () => {
  render(
    <MockedProvider mocks={errMocks}>
      <App />
    </MockedProvider>,
  )

  await act(async () => await new Promise((resolve) => setTimeout(resolve, 0)))

  const domPiece = screen.getByText('Error!')
  expect(domPiece).toBeInTheDocument()
})

Caching

Apollo comes with a built-in cache to help minimize network calls. It populates items that it can build a cache id for when you perform a query, and it will update known items with the response from a mutation; it will not insert new items from a mutation's response, however. By default, cache ids are generated from the __typename and the id or _id field, but this can be customized by setting typePolicies. For example, if I wanted to use the isbn field instead of the id field as my cache key (the __typename field is always used), I could use this:

const cache = new InMemoryCache({
  typePolicies: {
    Book: {
      keyFields: ["isbn"],
    },
  },
});

An excellent discussion of cache manipulation can be found at (8).
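To illustrate what this normalization means, here is a simplified sketch (not Apollo's actual implementation) of how a normalized cache derives an entity's id from the type name plus the configured key fields:

```typescript
// Simplified sketch of normalized-cache id derivation; illustrative only.
type Entity = { __typename: string; [field: string]: unknown }

function cacheId(entity: Entity, keyFields: string[] = ['id']): string {
  // Default behavior: type name plus the id field, e.g. "Book:1".
  if (keyFields.length === 1 && keyFields[0] === 'id') {
    return `${entity.__typename}:${entity['id']}`
  }
  // Custom keyFields serialize as a JSON object, e.g. 'Book:{"isbn":"..."}'.
  const key: Record<string, unknown> = {}
  for (const f of keyFields) key[f] = entity[f]
  return `${entity.__typename}:${JSON.stringify(key)}`
}
```

Two Book objects returned by different queries normalize to the same cache entry when their key fields match, which is exactly why a mutation response can update an existing item in place.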

When to Use

I prefer this framework when building a non-React JS frontend, such as an Angular UI. Be careful not to treat each query endpoint as a single call the way you would a REST API, and choose your cache strategy deliberately to keep your UI quick and responsive while still displaying correct information.

Relay

Relay is a GraphQL client built for React by Facebook. It is a little harder to learn than Apollo, partially because it relies heavily on fragments rather than simply building and making the calls you need directly. Its benefit, however, is that each component declares which fields of which types it needs, and the app makes a single call to the server when it loads or navigates to a new page. A completed demo based on the npx create-react-app library --template typescript template can be found at (9).

First, we need to install the required dependencies with npm i relay-runtime react-relay and npm i --save-dev relay-compiler babel-plugin-relay @types/relay-runtime @types/react-relay. Then add a section to the package.json to call the relay-compiler tool, along with some configuration values so it can generate the code correctly. Other configuration parameters can be found at (10).

"scripts": {
  "relay": "relay-compiler"
},
"relay": {
  "src": "./src",
  "schema": "./data/schema.graphql",
  "language": "typescript"
},

Now that we have Relay installed, we need the schema of the API our app is querying; this can typically be obtained through the server's introspection support, unless it is not published, in which case you probably are not supposed to be calling the server. Once you have it, place it in your project at data/schema.graphql.
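For example, one way to download the schema via introspection is the third-party get-graphql-schema npm tool (shown here as one option, assuming your server is running locally at the URL from our env file):

```shell
npx get-graphql-schema https://localhost:44377/graphql/ > data/schema.graphql
```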

Next, we will declare a TypeScript definition so we can use the graphql tag from babel-plugin-relay/macro without compiler errors; I placed this in src/types.d.ts:

declare module 'babel-plugin-relay/macro' {
  export { graphql } from 'react-relay'
}

Now we will set up our environment variables at src/env.ts:

const env = {
  GraphQLEndpoint: process.env.GRAPHQL_ENDPOINT || 'https://localhost:44377/graphql/'
}

export default env

Next, we have to write a couple of tools for Relay to tie into the server; I put these at src/relay-env.ts:

import {
  Environment,
  Network,
  RecordSource,
  RequestParameters,
  Store,
  Variables,
} from 'relay-runtime'
import env from './env'

const url = env.GraphQLEndpoint

function fetchQuery(
  operation: RequestParameters,
  variables: Variables,
) {
  return fetch(url, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      query: operation.text,
      variables,
    }),
  }).then(response => {
    return response.json()
  })
}

const environment = new Environment({
  network: Network.create(fetchQuery),
  store: new Store(new RecordSource()),
})

export default environment

Finally, we are ready to write some React components. First, we will write our App function:

import { Environment, QueryRenderer } from 'react-relay'
import defaultEnvironment from './relay-env'
import type {
  App_Query,
  App_Query$data,
} from './__generated__/App_Query.graphql'
import { graphql } from 'babel-plugin-relay/macro'
import BooksPage from './BooksPage'
import CreateUser from './CreateUser'

const query = graphql`
  query App_Query {
    allBooks {
      ...BooksPage_query
    }
  }
`

interface Props {
  error: Error | null
  props: App_Query$data | null
}

export function App({ error, props }: Props) {
  if (error) {
    return <div>Error!</div>
  }

  if (!props) {
    return <div>Loading...</div>
  }

  return (
    <div className="App container">
      <CreateUser />
      <BooksPage query={props.allBooks} />
    </div>
  )
}

export interface AppRootProps {
  environment?: Environment
}

function AppRoot({ environment }: AppRootProps) {
  // note: QueryRenderer<App_Query> is actually correct; it's a generic type that uses a Babel plugin like the graphql`` tags
  return (
    <QueryRenderer<App_Query>
      environment={environment ?? defaultEnvironment}
      query={query}
      render={(renderProps) => <App {...renderProps} />}
      variables={{}}
    />
  )
}

export default AppRoot

Tip: to get this working, first write your graphql tags for this and the components below, run npm run relay to generate the __generated__ folder, then write the rest of the code. If you try to paste this content as-is, npm run relay will not work because it references missing components.

The query is the root query of the app. Unlike Apollo, which works similarly to a REST API in that you can make many queries as you go, Relay has one root query that pulls the data for all child components. Note that we are using the TypeScript graphql definition we declared above; this is how the relay-compiler tool finds the GraphQL scripts it needs to generate code for. Here, we declare our root query with any arguments required, specify which object we are querying, and reference the fragment declared by a child component. Note also the generically typed QueryRenderer; this gives the type system the info it needs to type the render and variables props.

Our App is just a React component; because it is our root render node, the QueryRenderer calls it with arguments representing query errors and props, which are just the query response nodes. Here, we pass the props.allBooks piece into our child component.

Next, update the index.tsx file to call AppRoot instead of App and pass it the environment we defined in the relay-env file:
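A minimal index.tsx wiring might look like this, assuming the create-react-app defaults (your file may differ slightly; AppRoot falls back to the default environment if none is passed, so the explicit prop is optional):

```typescript
import React from 'react'
import ReactDOM from 'react-dom'
import AppRoot from './App'
import environment from './relay-env'

// Render the Relay-connected root with the environment from relay-env.ts
ReactDOM.render(
  <React.StrictMode>
    <AppRoot environment={environment} />
  </React.StrictMode>,
  document.getElementById('root'),
)
```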

GraphQL Server with Hot Chocolate

In my last post (1), we set up a GraphQL server with Hot Chocolate; in those post, I will show how we can call this server from various clients. First, we make calls from a C# app with Strawberry Shake, a client provided by the Hot Chocolate team; then we will make calls from a React web app with the two popular GraphQL clients Apollo and Relay. When using these clients it is important to remember that they are just making HTTPS (or whatever other transport you decide to use) calls behind the scene; these clients are simply wrappers that provide extra tooling to make your life easier. The server we are building against can be found on GitHub (2).

Strawberry Shake

Writing a Strawberry Shake client against .NET 5+ is very easy; a completed demo can be found at (3). First, we need to install the Strawberry Shake dotnet tools by running dotnet tool install StrawberryShake.Tools --local on the command line. Now we will create our client with dotnet graphql init https://localhost:44377/graphql/ -n LibraryClient -p ./Client Add a namespace property in the created .graphqlrc.json file under the extensions:strawberryShake alongside the url property; this is the namespace the generated client will be placed under. See (4) for more options.

Next, we need to define some queries and mutations for our app to run; place these in various .graphql files inside the newly-created Client folder. When the project is built, these files will be found and compiled into your library using source generators; there will also be a number of files created under a Generated folder to support your editor experience.

fragment BooksPage on AllBooksConnection {
  nodes {
    id
    isbn
    name
  }
}

query App {
  allBooks {
    ...BooksPage
  }
}

mutation CreateUser($username: String!) {
  createUser(name: $username) {
    id name
  }
}

Now that we have our client, we can tie into it through DI. Here is an example of using it with a console app; if it were an ASP.NET Core or Blazor website, you would move the IoC configuration to the ConfigureServices method and inject the ILibraryClient wherever you needed to use it.

static async Task Main(string[] args)
{
    var serviceCollection = new ServiceCollection();

    serviceCollection.AddScoped(sp => new HttpClient { BaseAddress = new Uri("https://localhost:44377") });

    serviceCollection
        .AddLibraryClient()
        .ConfigureHttpClient(client => client.BaseAddress = new Uri("https://localhost:44377/graphql"));

    var serviceProvider = serviceCollection.BuildServiceProvider();
    var client = serviceProvider.GetRequiredService<ILibraryClient>();

    var result = await client.App.ExecuteAsync();
    var data = result.Data;

    data.AllBooks.Nodes[0].MockedField = "test";
    var mockedData = data.AllBooks.Nodes[0].MockedField;

    var createUserResult = await client.CreateUser.ExecuteAsync("abcd");
    var createdUser = createUserResult.Data;
}

Testing

Because Strawberry Shake exposes a partial interface for all generated objects, we can easily mock our client wherever we inject it for use in tests; here is an example mock of our AppQuery call.

[Test]
public void MockAppQuery()
{
    var mockResult = new Mock<IOperationResult<IAppResult>>();
    mockResult.Setup(s => s.Data).Returns(new AppResult(
        new App_AllBooks_AllBooksConnection(new List<App_AllBooks_Nodes_Book>
        {
            new App_AllBooks_Nodes_Book(Guid.NewGuid(), "978-1617294532", "C# In Depth, Fourth Edition")
        })
    ));

    var mockAppQuery = new Mock<IAppQuery>();
    mockAppQuery.Setup(s => s.ExecuteAsync(It.IsAny<CancellationToken>()))
        .ReturnsAsync(mockResult.Object);

    var mockClient = new Mock<ILibraryClient>();
    mockClient.Setup(s => s.App).Returns(mockAppQuery.Object);

    // todo: act

    // todo: assert
}

When to Use

Use this client if you are calling a GraphQL API from a C# client; this could be if your backend calls Shopify, GitHub, or other GraphQL server, or if you are developing a Blazor website or WPF, Xamarin, or .NET Maui app.

Apollo

Apollo is the most popular GraphQL client for JS-based apps because of its ease of use; to achieve this, however, it does not enforce some of the ideals of GraphQL. These are discussed in more detail in the When to Use section. A completed demo based on the npx create-react-app library --template typescript template can be found at (5). To use this client, first, we need to install the latest @apollo/client and graphql packages with your package manager of choice; I use npm, since I am most familiar with it: npm i @apollo/client graphql Next, we need to create an ApolloClient instance; this client will tell our app where the graphql server lives and how to talk to it, and how to cache data. Put this in src/client.ts (This step, and many of the following steps, are demonstrated in the Apollo get-started docs at (6).)

import {
  ApolloClient,
  ApolloLink,
  HttpLink,
  InMemoryCache,
} from '@apollo/client'
import env from './env'

const httpLink = new HttpLink({
  uri: env.GraphQLEndpoint,
})

export const client = new ApolloClient({
  cache: new InMemoryCache(),
  link: ApolloLink.from([httpLink]),
})

Example environment variable configuration, which I like to place at src/env.ts:

const env = {
  GraphQLEndpoint: process.env.GRAPHQL_ENDPOINT || 'https://localhost:44377/graphql/'
}

export default env

Next, we will configure our GraphQL codegen system. Because we are using TypeScript and React, we would like to have our queries and mutations strongly typed and use hooks to perform our calls. While Apollo has a codegen system you can import, it has many bugs and I have not been able to get it working satifactorily; I prefer the @graphql-codegen library. To use this library, we will install our packages with npm i --save-dev @graphql-codegen/cli @graphql-codegen/typescript @graphql-codegen/typescript-react-apollo @graphql-codegen/typescript-operations Next, we will copy our schema, which can typically be found by using the server’s provided introspection, to data/schema.graphql and create a codegen.yml file at the root of the project:

schema: ./data/schema.graphql
documents: 'src/**/*.tsx'
generates:
  src/types-and-hooks.ts:
    plugins:
      - typescript
      - typescript-operations
      - typescript-react-apollo

Note that we tell it where our schema is (this could also be a URL pointing to a live server, which is useful when the schema is in active development), which documents to scan for scripts (you can also point it to .graphql files, if you would rather not have your queries and mutations in your .tsx files), where to put the generated code, and which plugins to use when generating it. Once we add "graphql:codegen": "graphql-codegen" to the scripts section of the package.json and run it, we will be able to write queries and use them with hooks in our React components. The best part about this library is if we were using Angular instead, for example, we could have used the typescript-apollo-angular plugin and nothing else would change around integrating with GraphQL other than the final usage (e.g. we would access it with dependency injection instead of hooks). Additional configuration options can be found at (7).

The next couple paragraphs are mostly React-focused; Apollo is not limited to working with React, so you can ignore the React-specific pieces if you are using a different frontend library.

Now we configure our ApolloProvider with an instance of our client in our App file. Now everything is set up, and we can use the generated hooks in our React components.

App.tsx
import { ApolloProvider, gql } from '@apollo/client'
import { useAppQuery } from './types-and-hooks'
import { client } from './client'
import BooksPage from './BooksPage'
import CreateUser from './CreateUser'

gql`
  query App {
    allBooks {
      ...BooksPage
    }
  }
`

// exported so we can access it for testing
export function App() {
  const { data, loading, error } = useAppQuery()

  if (error) {
    return <div>Error!</div>
  }

  if (loading) {
    return <div>Loading...</div>
  }

  return (
    <div className="App container">
      <CreateUser />
      <BooksPage data={data?.allBooks} />
    </div>
  )
}

function AppRoot() {
  return (
    <ApolloProvider client={client}>
      <App />
    </ApolloProvider>
  )
}

export default AppRoot
BooksPage.tsx
import { gql } from '@apollo/client'
import { BooksPageFragment } from './types-and-hooks'

gql`
  fragment BooksPage on AllBooksConnection {
    nodes {
      id
      isbn
      name
    }
  }
`

interface Props {
  query: BooksPageFragment | null
}

function BooksPage({ query }: Props) {
  return (
    <div className="BooksPage">
      {query?.nodes?.map((m) => (
        <div key={m.id}>
          {m.name} - {m.isbn}
        </div>
      ))}
    </div>
  )
}

export default BooksPage
CreateUser.tsx
import { gql } from '@apollo/client'
import { useCreateUserMutation } from './types-and-hooks'

gql`
  mutation CreateUser($username: String!) {
    createUser(name: $username) {
      id
      name
    }
  }
`

function CreateUser() {
  const [command] = useCreateUserMutation()

  const createUser = () =>
    command({
      variables: {
        username: 'asdf',
      },
    })

  return (
    <div className="CreateUser">
      <button style={{ float: 'right' }} onClick={createUser}>
        Create User
      </button>
    </div>
  )
}

export default CreateUser

Note: that the final solution I shared differs slightly so it runs against the final version of the server after we update it to follow Relay conventions (the GraphQL gold standard client) at the end of this file. The way I wrote it here reflects the state of the server at the end of my previous post (1).

Testing

Apollo can be tested very easily. First, we need to create an array of mock responses. Each mock will specify which query the mock is for and what data is returned. We will now pass these mocks to Apollo’s MockedProvider If we do not trigger our UI updates to process with an await act call, it is in the loading state; after that, it either sets the data or error depending what our mock returned. If we were not using fragments, we could set addTypename={false} on the MockedProvider and leave out the __typename fields in our mocks to make things simpler.

import { AppDocument, BooksPageFragment } from './types-and-hooks'
import { App } from './App'
import { MockedProvider, MockedResponse } from '@apollo/client/testing'
import { screen, render, act } from '@testing-library/react'

const mocks: MockedResponse[] = [
  {
    request: {
      query: AppDocument,
    },
    result: {
      data: {
        allBooks: {
          __typename: 'AllBooksConnection',
          nodes: [
            {
              __typename: 'Book',
              id: 1,
              isbn: '978-1617294532',
              name: 'C# In Depth, Fourth Edition',
            },
            {
              __typename: 'Book',
              id: 2,
              isbn: '978-1617295683',
              name: 'GraphQL in Action',
            },
          ] as BooksPageFragment,
        },
      },
    },
  },
]

const errMocks: MockedResponse[] = [
  {
    request: {
      query: AppDocument,
    },
    error: new Error('An error occurred'),
  },
]

it('renders loading state', () => {
  render(
    <MockedProvider mocks={mocks}>
      <App />
    </MockedProvider>,
  )

  const domPiece = screen.getByText('Loading...')
  expect(domPiece).toBeInTheDocument()
})

it('renders book list', async () => {
  render(
    <MockedProvider mocks={mocks}>
      <App />
    </MockedProvider>,
  )

  await act(async () => await new Promise((resolve) => setTimeout(resolve, 0)))

  const domPiece = screen.getByText('C# In Depth, Fourth Edition - 978-1617294532')
  expect(domPiece).toBeInTheDocument()
})

it('renders error state', async () => {
  render(
    <MockedProvider mocks={errMocks}>
      <App />
    </MockedProvider>,
  )

  await act(async () => await new Promise((resolve) => setTimeout(resolve, 0)))

  const domPiece = screen.getByText('Error!')
  expect(domPiece).toBeInTheDocument()
})

Caching

Apollo comes with a built-in cache to help minimize network calls. It populates items that it can build a cache id for when you perform a query, and will update known items with the response from a mutation; it will not insert new items from a mutation’s response, however. By default, cache items are generated using the __typename and id or _id field, but this can be customized by setting the typePolicies For example, if I wanted to use the isbn field instead of the id field as my cache key (the __typename field is always used), I could use this:

const cache = new InMemoryCache({
  typePolicies: {
    Book: {
      keyFields: ["isbn"],
    },
  },
});

An excellent discussion of cache manipulation can be found at (8).

When to Use

I prefer this framework when building a non-React JS frontend, such as an Angular UI, but be careful to not treat each query endpoint as a single call, like a REST API would, and be careful when choosing your cache strategy to keep your UI quick and responsive while still displaying the correct information.

Relay

Relay is a GraphQL client built for React by Facebook. It is a little more confusing to learn than Apollo, partially because it heavily relies on fragments, rather than simply building and making the calls you need directly. Its benefit, however, is that each component declares which fields of which types it needs, and the app makes a single call to the server when it loads or navigates to a new page. A completed demo based on the npx create-react-app library --template typescript template can be found at (9).

First, we need to install the required dependencies with npm i relay-runtime react-relay and npm i --save-dev relay-compiler babel-plugin-relay @types/relay-runtime @types/react-relay. Then add a section to package.json to call the relay-compiler tool, along with some configuration values so it can generate the code correctly. Other configuration parameters can be found at (10).

"scripts": {
  "relay": "relay-compiler"
},
"relay": {
  "src": "./src",
  "schema": "./data/schema.graphql",
  "language": "typescript"
},

Now that we have Relay installed, we need the schema of the API our app is querying; this can typically be obtained through the server’s introspection endpoint, unless introspection is not published, in which case you probably are not supposed to be calling the server. Once you have the schema, place it in your project at data/schema.graphql.

Next, we will declare a TypeScript definition so we can use it without compiler errors; I placed this in src/types.d.ts:

declare module 'babel-plugin-relay/macro' {
  export { graphql } from 'react-relay'
}

Now we will set up our environment variables at src/env.ts:

const env = {
  GraphQLEndpoint: process.env.GRAPHQL_ENDPOINT || 'https://localhost:44377/graphql/'
}

export default env

Next, we have to write a couple of helpers that Relay uses to talk to the server; I put these in src/relay-env.ts:

import {
  Environment,
  Network,
  RecordSource,
  RequestParameters,
  Store,
  Variables,
} from 'relay-runtime'
import env from './env'

const url = env.GraphQLEndpoint

function fetchQuery(
  operation: RequestParameters,
  variables: Variables,
) {
  return fetch(url, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      query: operation.text,
      variables,
    }),
  }).then(response => {
    return response.json();
  });
}

const environment = new Environment({
  network: Network.create(fetchQuery),
  store: new Store(new RecordSource()),
});

export default environment;

Finally, we are ready to write some React components. First, we will write our App function:

import { Environment, QueryRenderer } from 'react-relay'
import defaultEnvironment from './relay-env'
import type {
  App_Query,
  App_Query$data,
} from './__generated__/App_Query.graphql'
import { graphql } from 'babel-plugin-relay/macro'
import BooksPage from './BooksPage'
import CreateUser from './CreateUser'

const query = graphql`
  query App_Query {
    allBooks {
      ...BooksPage_query
    }
  }
`

interface Props {
  error: Error | null
  props: App_Query$data | null
}

export function App({ error, props }: Props) {
  if (error) {
    return <div>Error!</div>
  }

  if (!props) {
    return <div>Loading...</div>
  }

  return (
    <div className="App container">
      <CreateUser />
      <BooksPage query={props.allBooks} />
    </div>
  )
}

export interface AppRootProps {
  environment?: Environment
}

function AppRoot({ environment }: AppRootProps) {
  // note: QueryRenderer<App_Query> is actually correct; it's a generic type that uses a Babel plugin like the graphql`` tags
  return (
    <QueryRenderer<App_Query>
      environment={environment ?? defaultEnvironment}
      query={query}
      render={(renderProps) => <App {...renderProps} />}
      variables={{}}
    />
  )
}

export default AppRoot

Tip: to get this working, first write your graphql tags for this and the components below, run npm run relay to generate the __generated__ folder, then write the rest of the code. If you paste this content as-is, npm run relay will fail because the code references generated modules and components that do not exist yet.

The query is the root query of the app. Unlike Apollo, which works similarly to a REST API in that you can make many queries as you go, Relay has a single root query that pulls the data for all child components. Note that we are using the TypeScript graphql definition we declared above; this is how the relay-compiler tool finds the GraphQL operations it needs to generate code for. Here, we declare our root query with any arguments required, specify which object we are querying, and reference the fragment declared by a child component. Note also the generically-typed QueryRenderer; this gives the type system the info it needs to type the render and variables props.

Our App is just a React component; because it is our root render node, the QueryRenderer calls it with arguments representing the query error and props, which are just the query response nodes. Here, we pass the props.allBooks piece into our child component.

Next, update the index.tsx file to call AppRoot instead of App and pass it the environment we defined in the relay-env file:

root.render(
  <React.StrictMode>
    <AppRoot />
  </React.StrictMode>
);

Now we will look at the BooksPage component. Here, we define our React component with a query argument, which is the GraphQL query result. Note that we cannot access just anything from the query: Relay enforces that we only reference the fields our own fragment declares, and throws an error if we try to read a field a different component requested. For example, suppose we added the field publishedOn alongside ...BooksPage_query in the parent component’s query; if we then cast query to any and tried to reference query.publishedOn inside the BooksPage component, we would get a runtime error, even though the cast satisfies the type system. Note also that the type of the query prop is BooksPage_query$key, named after our fragment; the relay-compiler generates this type for us.

import { useFragment } from 'react-relay'
import { BooksPage_query$key } from './__generated__/BooksPage_query.graphql'
import { graphql } from 'babel-plugin-relay/macro'
interface Props {
  query: BooksPage_query$key | null
}
function BooksPage({ query }: Props) {
  const data = useFragment(
    graphql`
      fragment BooksPage_query on AllBooksConnection {
        nodes {
          id
          isbn
          name
        }
      }
    `,
    query,
  )
  return (
    <div className="BooksPage">
      {data?.nodes?.map((m) => (
        <div key={m.id}>
          {m.name} - {m.isbn}
        </div>
      ))}
    </div>
  )
}
export default BooksPage

Our CreateUser component is simply a button that calls a mutation to create a user with a hard-coded username when it is clicked (more details on advanced usage of mutations can be found at (11)):

import { useMutation } from 'react-relay'
import { graphql } from 'babel-plugin-relay/macro'
function CreateUser() {
  const [command] = useMutation(graphql`
    mutation CreateUserMutation($username: String!) {
      createUser(name: $username) {
        id
        name
      }
    }
  `)
  const createUser = () =>
    command({
      variables: {
        username: 'asdf',
      },
    })
  return (
    <div className="CreateUser">
      <button style={{ float: 'right' }} onClick={createUser}>
        Create User
      </button>
    </div>
  )
}
export default CreateUser

Routing

Because Relay only does one query per route and you do not control where or when the query happens, we need to consider our routing. For web apps with a single route or a flat routing structure, we can simply use a QueryRenderer on each rendered page-level component, as I did above. If our app had a tree of routes and we used a QueryRenderer on each route-level component, Relay would be unable to render the child routes until the parent route’s data request resolved and the component rendered, leading to delays as discussed at (12). We can resolve this by using the Found router and Found-Relay to perform all the data requests simultaneously. This will be left as an exercise for the reader.

Testing

Testing with Relay is quite simple, although slightly different depending on whether we are testing a root component or a component that consumes a fragment. To test our root AppRoot component, we update it to take an optional environment prop, then write our tests:

import App from './App'
import { createMockEnvironment, MockPayloadGenerator } from 'relay-test-utils'
import ReactTestRenderer from 'react-test-renderer'
test('Loading State', () => {
  const environment = createMockEnvironment()
  const renderer = ReactTestRenderer.create(
    <App environment={environment} />,
  )
  expect(
    renderer.root.find(node => node.children[0] === 'Loading...'),
  ).toBeDefined()
})
test('Data Render', () => {
  const environment = createMockEnvironment()
  const renderer = ReactTestRenderer.create(
    <App environment={environment} />,
  )
  ReactTestRenderer.act(() => {
    environment.mock.resolveMostRecentOperation(operation =>
      MockPayloadGenerator.generate(operation),
    )
  })
  expect(
    renderer.root.find(node => node.props.className === 'BooksPage'),
  ).toBeDefined()
})
test('Error State', () => {
  const environment = createMockEnvironment()
  const renderer = ReactTestRenderer.create(
    <App environment={environment} />,
  )
  ReactTestRenderer.act(() => {
    environment.mock.rejectMostRecentOperation(new Error('Uh-oh'))
  })
  expect(
    renderer.root.find(node => node.children[0] === 'Error!'),
  ).toBeDefined()
})

To test our BooksPage component that takes a fragment (remember to run the relay compiler again to pick up the new graphql query):

import BooksPage from './BooksPage'
import { createMockEnvironment, MockPayloadGenerator } from 'relay-test-utils'
import ReactTestRenderer from 'react-test-renderer'
import { RelayEnvironmentProvider, useLazyLoadQuery } from 'react-relay'
import { graphql } from 'babel-plugin-relay/macro'
import { Suspense } from 'react'
import { BooksPage_TestQuery } from './__generated__/BooksPage_TestQuery.graphql'
test('Renders book names', () => {
  const environment = createMockEnvironment()
  environment.mock.queueOperationResolver((operation) =>
    MockPayloadGenerator.generate(operation, {
      AllBooksConnection() {
        return {
          __typename: 'AllBooksConnection',
          nodes: [
            {
              id: '1',
              isbn: '123-123456789',
              name: 'Book 1',
            },
            {
              id: '2',
              isbn: '987-123456789',
              name: 'Book 2',
            },
          ],
        }
      },
    }),
  )
  const TestRenderer = () => {
    const data = useLazyLoadQuery<BooksPage_TestQuery>(
      graphql`
        query BooksPage_TestQuery @relay_test_operation {
          allBooks {
            ...BooksPage_query
          }
        }
      `,
      {},
    )
    return <BooksPage query={data.allBooks} />
  }
  const renderer = ReactTestRenderer.create(
    <RelayEnvironmentProvider environment={environment}>
      <Suspense fallback="Loading...">
        <TestRenderer />
      </Suspense>
    </RelayEnvironmentProvider>,
  )
  
  expect(renderer).toMatchSnapshot()
})

Additional details can be found at (13).

When to Use

This is my preferred client for React frontends because it is easy to use and strongly encourages the correct pattern of a single query per defined route. However, because it is the most opinionated of these clients, there are a few catches to watch for. First, the server must provide a way to refetch any node given an id; second, the server must provide a way to page through connections (14), which is typically done by implementing the following spec. Note that while my server-side blog post covers the second, it does not support the first; see the next section for how we update our server to fully support Relay.

interface Node {
  id: ID!
}
type PageInfo {
  hasNextPage: Boolean!
  hasPreviousPage: Boolean!
  startCursor: String
  endCursor: String
}
""" Example connection type """
type MyConnection {
  edges: [MyEdge]
  pageInfo: PageInfo!
}
type Query {
  node(id: ID!): Node
}
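The node(id:) field means every id must be globally unique and resolvable back to its owning type. The Relay spec treats these ids as opaque strings, but a common server convention (an assumption here, not a spec requirement, and not necessarily Hot Chocolate's exact encoding) is to base64-encode the type name together with the local id:

```typescript
// Sketch of one common global-id convention: base64("Type:localId").
// The Relay spec only requires ids to be opaque and unique; servers
// are free to use any encoding they like.
function toGlobalId(typeName: string, localId: string): string {
  return Buffer.from(`${typeName}:${localId}`).toString('base64')
}

function fromGlobalId(globalId: string): { typeName: string; localId: string } {
  const decoded = Buffer.from(globalId, 'base64').toString('utf8')
  const separator = decoded.indexOf(':')
  return {
    typeName: decoded.slice(0, separator),
    localId: decoded.slice(separator + 1),
  }
}

const bookId = toGlobalId('Book', '42')
const parsed = fromGlobalId(bookId)
```

The point of the encoding is that a node(id:) resolver can decode any id it receives, see which type it belongs to, and dispatch to the right per-type fetch.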

Updating the Server

Hot Chocolate provides excellent support for Relay; however, I did not fully implement it in my last post. I did use their paging support where applicable, so we do not need to change that; we do need to implement the universal ID, however. First, we will add .AddGlobalObjectIdentification() to our services.AddGraphQLServer() chain where we register our queries and mutations; this adds the middleware to convert our ids back and forth. Next, we will add the HotChocolate.Types.Relay.NodeAttribute attribute to our response type; this tells Hot Chocolate which pieces implement the Node interface. If our type did not have an Id property, we would need to put the IDAttribute on our id property, but we do not need to do that here because I followed their naming conventions. Next, we need to add a static method on each type that implements the Node interface; this method takes an id parameter of the same type as our Id property, as many [Service] parameters as we need, and returns either T or Task<T>, where T is the node type. This method should be named Get, GetAsync, Get{T}, or Get{T}Async, where {T} is the name of the node type (e.g. GetBookAsync); if we do not follow the naming convention, we can set the NodeResolverAttribute on the method. This is my implementation for the Book type:

public static async Task<Book?> GetAsync(Guid id, [Service] IBookApplication bookApp)
{
    return await bookApp.Get(id);
}

Finally, we need to apply the [ID] attribute to our queries that take an id parameter as well:

public IExecutable<Book> GetBook(
    [Service] IMongoCollection<Book> collection, [ID] Guid id)
{
    if (!featureFlags.EnableBook)
    {
        throw new QueryException("Query not implemented");
    }
    return collection.Find(x => x.Id == id).AsExecutable();
}

Now our server queries are Relay-compliant with minimal work on our end. The final thing we need to do is change our mutations: instead of accepting a list of parameters and returning our query objects directly, each mutation should accept a single Input object and return a Payload object, as shown in the following schema definition:

# Old
type Mutation {
  checkoutBook(userId: UUID!, bookId: UUID!): User
}
# New
input CheckoutBookInput {
  userId: UUID!
  bookId: UUID!
}
type CheckoutBookPayload {
  user: User
}
type Mutation {
  checkoutBook(input: CheckoutBookInput!): CheckoutBookPayload!
}

Making this change is trivial, so I will leave it to the reader as an exercise. Do note that if you add .AddQueryFieldToMutationPayloads() to the server definition, it will add an additional query: Query field to your mutation payloads; this lets us pull all updated fields at once. Our checkout book command could then be:

mutation CheckoutBook($input: CheckoutBookInput!, $bookId: UUID!) {
  checkoutBook(input: $input) {
    user {
      id name
    }
    query {
      book(id: $bookId) {
        id name isbn
      }
    }
  }
}

Find the full details about adding Relay support to a Hot Chocolate server at (15).

References

  1. Previous blog post: https://superdevelopment.com/2022/11/10/graphql-server-with-hot-chocolate/
  2. GraphQL Server: https://github.com/Hosch250/Library-DDD/tree/graphQLHotChocolate
  3. Strawberry Shake Client Demo: https://github.com/Hosch250/Library-DDD/tree/graphQLHotChocolate/GraphQLClient
  4. Get Started with Strawberry Shake: https://chillicream.com/docs/strawberryshake/get-started
  5. Apollo Client Demo: https://github.com/Hosch250/graphql-web-clients/tree/main/apollo
  6. Get Started with Apollo: https://www.apollographql.com/docs/react/get-started
  7. GraphQL Code Generator: https://www.graphql-code-generator.com/docs/guides/react
  8. Apollo Cache: https://medium.com/rbi-tech/tips-and-tricks-for-working-with-apollo-cache-3b5a757f10a0
  9. Relay Client Demo: https://github.com/Hosch250/graphql-web-clients/tree/main/relay
  10. Relay Compiler: https://github.com/facebook/relay/blob/main/packages/relay-compiler/README.md
  11. Relay Mutations: https://relay.dev/docs/guided-tour/updating-data/graphql-mutations/
  12. Relay Routing: https://relay.dev/docs/v1.6.1/routing/
  13. Testing Relay: https://relay.dev/docs/guides/testing-relay-components/
  14. Relay Server Specification: https://relay.dev/docs/guides/graphql-server-specification/
  15. Relay with Hot Chocolate: https://chillicream.com/docs/hotchocolate/defining-a-schema/relay

Domain-Driven Design

You’ve decided to use Domain-Driven Design (DDD), but aren’t sure how to implement it. Maybe you’ve seen it go wrong before and aren’t sure how to prevent that happening again. Maybe you’ve never done it and aren’t sure where to start. This post will show you how to implement a DDD domain layer, including aggregates, value objects, domain commands, and validation, and how to avoid some of the pitfalls I’ve seen. It will not discuss the why of DDD vs other competing patterns; nor, for the sake of brevity, will it discuss the infrastructure or application layers of a DDD app. To demonstrate these concepts in action, I have built a backend for a library using DDD; the most relevant sections will be shown in the post, and the full version can be found on GitHub. The tech stack I used is an ASP.NET Core API written in C# backed by MongoDB.

The Aggregate Root

The aggregate root is the base data entity of a data model. This entity will contain multiple properties, which may be base CLR types or value objects. Value objects can be viewed as objects that are owned by the aggregate root. Each object, whether an aggregate root or a value object, is responsible for maintaining its own state. We will start by defining an abstract aggregate root type with properties all our aggregate roots will have:

public abstract class AggregateRoot
{
    public string AuditInfo_CreatedBy { get; private set; } = "Library.Web";
    public DateTime AuditInfo_CreatedOn { get; private set; } = DateTime.UtcNow;

    public void SetCreatedBy(string createdBy)
    {
        AuditInfo_CreatedBy = createdBy;
    }
}

Next, we will define an implementation of this type containing a couple of internal constructors, a number of data properties, and a couple of methods for updating those properties. Looking through the implementation below, you will probably note that my data properties have private setters and methods for setting them. This looks a little strange when you consider that properties allow custom setters, but the reason for it is serialization. When we deserialize an object from our DB, we don’t want to go through any validation we might do when setting a property; we just want to read into the property and assume the data has already been validated. When the data changes, we do need to validate it, so we make the property setters private and provide public methods to set the data. Another benefit of the methods is that you can pass a domain command to them, instead of just the final expected value of the property; this allows you to provide supplemental information as necessary.

public class User : AggregateRoot
{
    /// <summary>
    /// Used for deserialization
    /// </summary>
    [BsonConstructor]
    internal User(Guid id, string name, bool isInGoodStanding, List<CheckedOutBook> books)
    {
        Id = id;
        Name = name;
        IsInGoodStanding = isInGoodStanding;
        this.books = books;
    }

    /// <summary>
    /// Used by the UserFactory; prefer creating instances with that
    /// </summary>
    internal User(string name)
    {
        Id = Guid.NewGuid();
        Name = name;
        IsInGoodStanding = true;
    }

    public Guid Id { get; private set; }
    public string Name { get; private set; }
    public bool IsInGoodStanding { get; private set; }

    [BsonElement(nameof(Books))]
    private readonly List<CheckedOutBook> books = new();
    public IReadOnlyCollection<CheckedOutBook> Books => books.AsReadOnly();

    public async Task CheckoutBook(CheckoutBookCommand command)
    {
        // validation happens in any event handler listening for this event
        // e.g. Does the library have this book, is it available, etc.
        await DomainEvents.Raise(new CheckingOutBook(command));

        var checkoutTime = DateTime.UtcNow;
        books.Add(new CheckedOutBook(command.BookId, checkoutTime, checkoutTime.Date.AddDays(21)));
        await DomainEvents.Raise(new CheckedOutBook(command));
    }

    public async Task ReturnBook(ReturnBookCommand command)
    {
        // validation happens in any event handler listening for this event
        // e.g. Does the user have this book checked out, etc.
        await DomainEvents.Raise(new ReturningBook(command));

        books.RemoveAll(r => r.BookId == command.BookId);
        await DomainEvents.Raise(new ReturnedBook(command));
    }
}

public class CheckedOutBook
{
    public CheckedOutBook(Guid bookId, DateTime checkedOutOn, DateTime returnBy)
    {
        BookId = bookId;
        CheckedOutOn = checkedOutOn;
        ReturnBy = returnBy;
    }

    public Guid BookId { get; private set; }
    public DateTime CheckedOutOn { get; private set; }
    public DateTime ReturnBy { get; private set; }
}

Having POCOs or dumb objects (objects that aren’t responsible for maintaining their internal state) is often one of the first mistakes people make when doing DDD. They will create a class with public getters and setters and put their logic in a service (I will go over domain services and why you don’t usually want to use them later). The problem with this is that two places might be working with the same object instance at the same time and write data that the other is reading or writing, so the object risks ending up in an inconsistent state. DDD prevents inconsistent state by only allowing the object to set its own state, so if two consecutive changes to the same object would lead to inconsistent state, the object will catch that with its internal validation, instead of relying on the caller to have validated the change.

Domain Commands

Domain commands are how you tell an aggregate to update itself. In the code above, CheckoutBook and ReturnBook are domain commands. It isn’t strictly necessary to create a command type to represent the data being passed; you could have just passed a Guid bookId instead of a command class into the method. However, I like creating a command type because you have a single object to run validation against, and you can validate parameters when creating the command instance. For example, if your domain command requires a certain value be provided, you could validate that it’s not null in the type constructor instead of in the domain command itself. The validation on the type especially helps the logic flow well; you can’t really validate a Guid without additional context; you can validate a ReturnBookCommand type that contains a Guid, and you already have the additional context around what the Guid is.

public class CheckoutBookCommand
{
    public Guid BookId { get; }
    public Guid UserId { get; }

    public CheckoutBookCommand(Guid userId, Guid bookId)
    {
        if (bookId == Guid.Empty) { throw new ArgumentException($"Argument {nameof(bookId)} cannot be an empty guid", nameof(bookId)); }
        if (userId == Guid.Empty) { throw new ArgumentException($"Argument {nameof(userId)} cannot be an empty guid", nameof(userId)); }

        BookId = bookId;
        UserId = userId;
    }
}

Validation

You probably noticed the comments in the domain command implementations about validation. Validation is often tricky to get right in DDD because it requires other dependencies, such as a DB. For example, to successfully check out a book, the system has to make sure both the book and the user are in the system, that the book is available, that the user is in good standing, etc. To do this, we have already pulled the user from the DB to get the user aggregate, so we know the user is in the system. However, we haven’t checked that the book is in the system, so we need to reference a database instance when we do our validation inside the domain command. We can’t inject a DB instance into the aggregate because we don’t resolve aggregates from the IoC container, and even if we could, it’s not the aggregate’s responsibility to connect to the DB. We could new up a DB instance in the command, but that is wrong for reasons outside the scope of this article, in addition to it not being the aggregate’s responsibility to talk to the DB (research Dependency Injection and Inversion of Control if you don’t know why). This is where our command system comes into play. Notice the DomainEvents.Raise call. I have that implemented with MediatR, which is a .NET implementation of the mediator pattern; see the link at the end of this article for more detail:

public static class DomainEvents
{
    public static Func<IPublisher> Publisher { get; set; }
    public static async Task Raise<T>(T args) where T : INotification
    {
        var mediator = Publisher.Invoke();
        await mediator.Publish<T>(args);
    }
}
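The idea behind DomainEvents.Raise is plain publish/subscribe: handlers register for a notification type, and Raise invokes every registered handler and awaits completion, so a validation handler can throw and abort the command. MediatR provides the real implementation in .NET; the sketch below is a minimal, language-agnostic illustration written in TypeScript, not the post's actual code.

```typescript
// Minimal mediator sketch: handlers subscribe per event name, raise awaits all.
type Handler<T> = (event: T) => Promise<void>

class Mediator {
  private handlers = new Map<string, Handler<any>[]>()

  subscribe<T>(eventName: string, handler: Handler<T>): void {
    const list = this.handlers.get(eventName) ?? []
    list.push(handler)
    this.handlers.set(eventName, list)
  }

  // Equivalent of DomainEvents.Raise: publish and wait for every handler,
  // so a throwing validation handler propagates to the caller.
  async raise<T>(eventName: string, event: T): Promise<void> {
    const list = this.handlers.get(eventName) ?? []
    await Promise.all(list.map((handler) => handler(event)))
  }
}

// Usage: a validation handler that rejects an empty book id.
const mediator = new Mediator()
mediator.subscribe<{ bookId: string }>('CheckingOutBook', async (event) => {
  if (!event.bookId) throw new Error('BookId is required')
})
```

The aggregate never sees the handlers; it only raises the event, which is what keeps DB-dependent validation out of the entity.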

We register IPublisher and our notifications and commands with our IoC container so we can resolve dependencies in our handlers. We then create a method that knows how to resolve an IPublisher instance and assign it to the static Publisher property in our startup. The static Raise method then has all the information it needs to raise the event and wait for the handlers to complete. In this example, I use the FluentValidation library for validation within these handlers. We could put an error handler in our HTTP response pipeline to catch ValidationExceptions and translate them into 400 responses.

public class CheckingOutBook : INotification
{
    public CheckoutBookCommand Command { get; }

    public CheckingOutBook(CheckoutBookCommand command) => Command = command;
}

public class CheckingOutBookValidationHandler : INotificationHandler<CheckingOutBook>
{
    private readonly CheckingOutBookValidator validator;

    public CheckingOutBookValidationHandler(CheckingOutBookValidator validator) => this.validator = validator;

    public Task Handle(CheckingOutBook @event, CancellationToken cancellationToken)
    {
        validator.ValidateAndThrow(@event.Command);

        return Task.CompletedTask;
    }
}

public class CheckingOutBookValidator : AbstractValidator<CheckoutBookCommand>
{
    public CheckingOutBookValidator(ILibraryRepository repository)
    {
        RuleFor(x => x.UserId)
            .MustAsync(async (userId, _) =>
            {
                var user = await repository.GetUserAsync(userId);
                return user?.IsInGoodStanding == true;
            }).WithMessage("User is not in good standing");

        RuleFor(x => x.BookId)
            .MustAsync(async (bookId, _) => await repository.GetBookAsync(bookId) is not null)
            .WithMessage("Book does not exist")
            .DependentRules(() =>
            {
                RuleFor(x => x.BookId)
                    .MustAsync(async (bookId, _) => !await repository.IsBookCheckedOut(bookId))
                    .WithMessage("Book is already checked out");
            });
    }
}

Creating Entities

At this point you may be wondering how we ensure an aggregate root is valid on initial creation, since we can’t await results in a constructor the way we do in the command handlers inside the entity. This is a prime case for factories: we make our constructor internal to reduce its accessibility as much as possible, and create a factory that makes any infrastructure calls it needs, calls the constructor, then raises an event with the newly created entity as data that can be used to validate it. This way, we encapsulate all the logic needed to create an entity, instead of relying on each place an entity is created to perform the logic correctly and ensure the entity is valid.

public class UserFactory
{
    public async Task<User> CreateUserAsync(string name)
    {
        var user = new User(name);
        await DomainEvents.Raise(new CreatingUser(user));

        return user;
    }
}

Domain Services

You are probably wondering at this point why I didn’t simply use a service to perform the checkout book command. For example, I could define a service with a method CheckoutBook(User user, Guid bookId) and perform all the validation inline, instead of importing MediatR and FluentValidation and creating three classes simply to validate my user. Then I would inject this service wherever the domain command is called and call the service instead of calling the domain command. I could still have my domain command be responsible for updating the entity instance to ensure it isn’t having random values assigned in places. The problem with this is that I now have some logic in the service and some in my entity; how do I determine which logic goes where? When multiple devs are working on a project, this becomes very difficult to manage; people have to figure out where existing logic lives and where to put new logic. This often leads to duplicated logic, which leads to bugs when one copy is updated and the other isn’t, among other issues. Additionally, as I mentioned above, because the validation logic occurs outside my entity, I can no longer trust that the entity is in a valid state, because I don’t know whether the validation was run before the command to update the entity was called. Because DDD, implemented correctly, only allows the entity to update itself, we can validate data changes once inside the entity just before we update it, instead of hoping the caller remembered to fully validate the changes.

Increase Local Reasoning with Stateless Architecture and Value Types

It is just another Thursday of adding features to your mobile app.

You have blasted through your task list by extending the current underlying object model + data retrieval code.

Your front-end native views are all coming together. The navigation between views and specific data loading is all good.

Git Commit. Git Push. The build pops out on HockeyApp. The Friday sprint review goes well. During the sprint review the product manager points out that full CRUD (Create, Read, Update, Delete) functionality is required in each of the added views. You only have the ‘R’ in ‘CRUD’ implemented. You look through your views, think it just can’t be that bad to add C, U and D, and commit to adding full CRUD to all the views by next Friday’s sprint review.

The weekend passes by, you come in on Monday and start going through all your views to add full CRUD. You update your first view with full CRUD; start navigating through your app; do some creates, updates, and deletes; and notice that all of those other views you added last week are just broken. Whole swaths of classes are sharing data you didn’t know was shared between them. Mutation to data in one view has unknown effects on the other views due to the shared references to data classes from your back-end object model.

Your commitment to having this all done by Friday is looking like a pipe-dream.


C# vs. Swift – Iron Man vs. Captain America

In Captain America: Civil War we get to see the ultimate battle between Iron Man and Captain America.

It is a battle of simple gutty defense vs. smart weapons and flashy offense, humility vs. brashness, down in the dirt vs. up in the clouds.

To totally geek it up, the same kind of battle exists in the languages that software engineers use today and I believe this is especially true in the battle of C# vs. Swift.

Don’t worry, this really isn’t a versus type write up. If anything I seek to point out each language’s unique strengths, then show how software engineers can get into the right superhero mindset to really use those strengths, and be aware of the weaknesses, to create great solutions.


Implementing HAT​EOAS​: One Team’s Journey

HATEOAS stands for “Hypermedia as the Engine of Application State” and it is one of the possible constraints that you can place on a REST-compliant API. Essentially, it means that your API is as navigable as a normal website, with hyperlinks leading to other resources. The focus of this blog is not HATEOAS itself; instead, it focuses on an implementation of it our team recently used for our project’s API.

[Read more…]

Managing Process Efficiently: Intro to the Disruptor Pattern

The Disruptor is, essentially, a scheduling strategy builder for multithreaded code. It stands out in the world of concurrent programming because it offers both great execution speed and easily readable and debuggable code. Yes, it does have a weird name. According to the original whitepaper, it was coined “Disruptor” because

it had elements of similarity for dealing with graphs of dependencies to the concept of “Phasers” in Java 7…

Of course, it is much more than just a Star Trek joke. The pattern was developed by the LMAX Exchange to build a competitive, low-latency trading platform that could handle millions of transactions per second. Luckily for us developers, they have open-sourced the code. The reference implementation is written in Java, but there is a C# implementation as well.
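The core idea can be sketched in miniature: pre-allocated slots in a ring buffer are claimed and released via monotonically increasing sequence numbers rather than locks or queues. The toy single-producer, single-consumer buffer below illustrates the concept only; it is not the LMAX library’s API, which adds multiple consumers, batching, and pluggable wait strategies.

```java
import java.util.concurrent.atomic.AtomicLong;

// Toy single-producer, single-consumer ring buffer: slots are claimed and
// released via sequence numbers instead of locks. Illustrative only.
public class TinyRing {
    private final long[] slots;
    private final int mask;                                  // size must be a power of two
    private final AtomicLong published = new AtomicLong(-1); // last slot written
    private final AtomicLong consumed  = new AtomicLong(-1); // last slot read

    public TinyRing(int size) {
        slots = new long[size];
        mask = size - 1;
    }

    // Producer: spin until a slot is free, write, then publish the sequence.
    public void publish(long value) {
        long seq = published.get() + 1;
        while (seq - consumed.get() > slots.length) { Thread.onSpinWait(); }
        slots[(int) (seq & mask)] = value;
        published.set(seq);                                  // makes the slot visible
    }

    // Consumer: spin until the next sequence is published, then read it.
    public long take() {
        long seq = consumed.get() + 1;
        while (published.get() < seq) { Thread.onSpinWait(); }
        long value = slots[(int) (seq & mask)];
        consumed.set(seq);
        return value;
    }
}
```

Because producer and consumer each advance their own counter, the hot path is a couple of reads, a write, and a counter bump, which is where the pattern’s speed comes from.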

[Read more…]

C# / .NET for Mobile Development: Worth a Second Look

Over the past few years, I believe that Microsoft and their Xamarin partners have created a compelling, quick, stable, and rich ecosystem for native development across Android, iOS, and Windows 10. It is finally time for native app developers to double back and take a look at C# / .NET-based code for their native platform needs.

[Read more…]

Garbage Collection and the Finalizer

One aspect of modern web development that sometimes seems to be taken for granted is memory management. While you might not need to create a custom boot disk anymore in order to run your application on a modern machine, it is still important to understand how your memory allocations are cleaned up. Two of the main components of memory cleanup are the garbage collector and the finalizer.
[Read more…]

Common Pitfalls with IDisposable and the Using Statement

Memory management with .NET is generally simpler than it is in languages like C++ where the developer has to explicitly handle memory usage. Microsoft added a garbage collector to the .NET framework to clean up objects and memory usage from managed code when they are no longer needed. However, since the garbage collector does not deal with resource allocation from unmanaged code, such as COM object interaction or calls to external unmanaged assemblies, the IDisposable pattern was introduced to provide developers with a way to ensure that those unmanaged resources are properly handled. Any class that deals with unmanaged code is supposed to implement the IDisposable interface and provide a Dispose() method that explicitly cleans up any unmanaged resources. Probably the most common way that developers dispose of these objects is through the using statement.
[Read more…]