Typesetting Markdown – Part 4: Theme Style

This part of the series describes a way to define colours, fonts, and layout such that content is separated from presentation.

Introduction

Part 3 demonstrated a build script that performs continuous integration when writing documents. Before styling a minimal document theme using ConTeXt, let’s see how to apply the open-closed principle to the continuous integration script.

Open-Closed Principle

Writing reusable software is a lofty goal. Part 3 defined the shell scripts build and ci, where ci was a modified version of build. Ideally, the ci script would only need to implement functionality specific to its purpose. Common behaviour can be abstracted into a parent script template that a child script inherits. That is, the parent build script template can be open for extension—its children may change its behaviour—yet closed for modification.
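
As a minimal sketch of the idea (the file and function names here are hypothetical, not part of the actual template), a parent script defines default behaviour that a child script replaces after sourcing it:

# parent-template: defines default behaviour; open for extension,
# closed for modification.
log_message() {
  echo "parent: $1"
}

# child script: sources the parent, then redefines log_message
# without modifying parent-template itself.
source parent-template

log_message() {
  echo "child: $1"
}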

Start by renaming the build script to build-template, or jump to the download build script template section to download the updated script.

Revise Build Script Template

This section describes how to change the build-template to make most of its functionality reusable without creating additional copies.

Remove main Call

Making build-template the parent of potentially multiple child scripts requires removing the following line:

main "$@"

Bash executes scripts from the top down, so when the parent script (build-template) is included by the child script (ci), that line would otherwise run before the child script has a chance to influence the parent's behaviour.

Change Colour Definition Constants

The colour definitions were previously declared readonly; now, child scripts may need to override the colour preferences set by the parent script. Change the lines in build-template to the following:

COLOUR_LOGGING=${COLOUR_BLUE}
COLOUR_WARNING=${COLOUR_YELLOW}
COLOUR_ERROR=${COLOUR_DKRED}

Now a child script can assign those global values to different colours.
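
For example, a child script that prefers yellow log messages could reassign the variable after sourcing the template:

source build-template

# Override the parent's default logging colour.
COLOUR_LOGGING=${COLOUR_YELLOW}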

Define Help Arguments

By default, each child script inherits the parent’s command-line arguments. Put them into an array using the following syntax:

ARGUMENTS=(
  "-d,--debug,Log messages while processing"
  "-h,--help,Show this help message then exit"
)

The comma-separated values are parsed in a function that displays the script’s usage. Note how each array element is on a line by itself.

Define Software Requirements

Similar to the help arguments, define another array that lists software requirements:

DEPENDENCIES=(
  "printf,https://www.gnu.org/software/coreutils/"
)

Most distributions have the printf command, so the single array value is more of a placeholder for child scripts to overwrite than anything else.

Argument Parsing Overrides

Rename parse_commandline() to arguments() and redefine it so that child scripts have the opportunity to parse a command-line argument. The code resembles the following:

arguments() {
  while [ "$#" -gt "0" ]; do
    local consume=1

    case "$1" in
      -d|--debug)
        log=utile_log
      ;;
      -h|-\?|--help)
        usage=utile_usage
      ;;
      *)
        set +e
        argument "$@"
        consume=$?
        set -e
      ;;
    esac

    shift ${consume}
  done
}

Previously, set -o errexit was enabled to prevent errors from cascading throughout the script: any command or function that returns a non-zero value causes the script to terminate. Giving child scripts the ability to parse their own command-line arguments requires that the parent script be instructed to consume an appropriate number of arguments. The snippet from the above function that accomplishes this is as follows:

        set +e
        argument "$@"
        consume=$?
        set -e

The first line disables errexit mode, which allows the argument function to return non-zero values without terminating the shell script. The second line calls the argument function (to be defined shortly). The third line uses the return value from the argument function to determine how many command-line arguments must be consumed. The last line restores errexit mode.

As for the argument function, it is simply:

argument() {
  return 1
}

At this point, child scripts can redefine the argument function to parse additional command-line arguments.

Revise Usage Message

Rather than hard-code each printf statement that corresponds to a command-line argument, introduce a loop that iterates over the previously-defined ARGUMENTS array, such as:

utile_usage() {
  local args=()
  mapfile -t args < <(IFS=$'\n'; sort <<< "${ARGUMENTS[*]}")

  printf "Usage: %s [OPTION...]\n" "${SCRIPT_NAME}" >&2

  for argument in "${args[@]}"; do
    # Extract the short [0] and long [1] arguments and description [2].
    IFS=',' read -ra arg <<< "${argument}"

    printf "  %s, %-15s%s\n" "${arg[0]}" "${arg[1]}" "${arg[2]}" >&2
  done

  unset IFS
  return 0
}

It is customary to list command-line arguments alphabetically, which is accomplished by the first two lines.

The difference between ARGUMENTS[@] and ARGUMENTS[*] is subtle. Where ARGUMENTS[*] combines all values into one string, ARGUMENTS[@] requotes the individual elements. Another difference is that when IFS is set, using * joins the elements with the first character of IFS. Here, the elements are joined by newlines and fed into the sort command, which sorts its standard input alphabetically; the output is then stored in the args array.
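
A quick demonstration of the difference:

ARGS=("one two" "three")

(IFS=';'; echo "${ARGS[*]}")   # one two;three (a single string)
printf '[%s]\n' "${ARGS[@]}"   # [one two] then [three] (requoted)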

In bash, referencing and expanding variables happens inside the delimiters ${ and }. In the following loop construct, args[@] is the syntax to use when iterating over the args array's elements; the @ expands to all indexes and, because the expansion is quoted, preserves elements containing spaces as single words:

  for argument in "${args[@]}"; do

Next, setting IFS=',' changes the internal field separator to a comma. The built-in read command uses the IFS value to split lines into separate words. Since each ARGUMENTS element contains comma-separated values, using IFS and read is a terse, idiomatic way to parse the arguments.
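
For example:

IFS=',' read -ra arg <<< "-h,--help,Show this help message then exit"
echo "${arg[0]}"   # -h
echo "${arg[1]}"   # --help
echo "${arg[2]}"   # Show this help message then exit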

The last line of the loop is now a singular representation for how each help line is displayed:

    printf "  %s, %-15s%s\n" "${arg[0]}" "${arg[1]}" "${arg[2]}" >&2

Recall that all ARGUMENTS array elements must have three comma-delimited values, as shown here:

  "-h,--help,Show this help message then exit"

The first comma-delimited value (-h) is the short-hand argument parsed into ${arg[0]}, the second value (--help) is the long-hand argument parsed into ${arg[1]}, and the third value is the message to display. These are mapped to the three %s values provided as the first argument to printf.

To align the message text, accounting for lengthy long-hand argument names, the double tabs (\t\t) are replaced with %-15s, which reserves 15 characters for the long-hand name (padding with spaces as necessary) and left-aligns (-) the text.
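
Running the statement in isolation shows the padding at work:

printf "  %s, %-15s%s\n" "-h" "--help" "Show this help message then exit"
# Output:
#   -h, --help         Show this help message then exit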

Eliminating multiple occurrences of printf by introducing a loop has made the code easier to maintain. Future versions of the template could call a function to display the text, which would allow child scripts to control how each help item is displayed—independently of the loop that displays them.

Revise Requirements Validation

Validation is separated into distinct functions: one to iterate over all software requirement items and one to check each requirement. This helps avoid deeply indented code and upholds the single responsibility principle. As a side benefit, the code that verifies each requirement no longer needs the global REQUIRED_MISSING variable. Use a local variable, like so:

required() {
  local result=0

  command -v "$1" > /dev/null 2>&1 && result=1 || \
    warning "Missing requirement: install $1 ($2)"

  return ${result}
}

If the command built-in succeeds, it means the required dependency exists, so result is set to 1; otherwise, result remains 0 and a warning is displayed.
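
For example, a hypothetical check:

# Prints nothing and returns 1 when pandoc is installed; otherwise
# displays a warning and returns 0.
required "pandoc" "https://www.pandoc.org"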

Consider the following algorithm:

  1. Count the number of software requirements.
  2. Get the name and website for each required software tool.
  3. Check whether the required command exists.
  4. Display a warning to the user if the tool is not found.
  5. Track the number of required software tools found.
  6. Answer whether all commands are available.

Here is one possible rewrite of the original validate_requirements function that implements the algorithm:

requirements() {
  $log "Verify requirements"
  local -r expected_count=${#DEPENDENCIES[@]}
  local total_count=0

  for dependency in "${DEPENDENCIES[@]}"; do
    IFS=',' read -ra dependent <<< "${dependency}"

    required "${dependent[0]}" "${dependent[1]}"
    total_count=$(( total_count + $? ))
  done

  unset IFS

  return $(( total_count / expected_count ))
}

Code for parsing the comma-delimited list of dependencies is similar enough to the help-parsing code of utile_usage() that refactoring is tempting. After all, both functions are parsing comma-delimited lists, which is a form of duplication. Let’s leave this as an exercise for the reader.

The return statement performs integer division to determine whether to return 1 or 0. If the script has 5 dependencies but only 4 are found, the calculation is 4 / 5, or 0.8, which is truncated to 0. If all 5 are found, then the script evaluates 5 / 5, which equals 1. (There will be a divide-by-zero error if the DEPENDENCIES array is empty.)

Arithmetic expressions in bash are delimited by $(( and )); a few contexts, such as array subscripts and the (( ... )) compound command, are evaluated arithmetically without them.
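
For example:

echo $(( 4 / 5 ))   # 0: integer division truncates toward zero
echo $(( 5 / 5 ))   # 1: all requirements found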

Processing

Changing the working directory can be considered a prerequisite to executing commands. Implement a preprocess function in the parent script to provide this functionality distinct from the execute_tasks function. For example:

preprocess() {
  $log "Preprocess"
  local result=1

  # Track whether change directory failed.
  cd "${SCRIPT_DIR}" > /dev/null 2>&1 || result=0

  return "${result}"
}

For symmetry, include a postprocess function that is called after all the script’s main work is accomplished, but before the script’s happy path terminates:

postprocess() {
  $log "Postprocess"

  return 1
}

Rename execute_tasks() to execute() and replace its code as follows:

execute() {
  return 1
}

Each child script now has the opportunity to override what happens before, after, and during execution.

Rewrite Main

After all the preliminary refactoring is complete, rewrite main() as follows:

main() {
  arguments "$@"

  $usage       && terminate 3
  requirements && terminate 4

  preprocess   && terminate 5
  execute      && terminate 6
  postprocess  && terminate 7

  terminate 0
}

If any function call from $usage to postprocess returns 0, the script will terminate with an exit level of 3 through 7, respectively. Stated another way, each function must return a non-zero value for the script to proceed to the next function call. (Although errexit normally terminates the script when a command returns a non-zero value, each function here is invoked as the left-hand side of &&, a context in which errexit is suspended.)

Reference Script Directory

Earlier, the following code was used to get the full path to the script’s working directory:

readonly SCRIPT_SRC="$(dirname "${BASH_SOURCE[0]}")"

The value of SCRIPT_SRC will be the path relative to the parent script (i.e., $HOME/bin), which is not the child script’s path. Without addressing this issue, the continuous integration script will monitor $HOME/bin for changes instead of the child script’s own directory. Get the child script’s working directory by changing the line to the following:

readonly SCRIPT_SRC="$(dirname "${BASH_SOURCE[${#BASH_SOURCE[@]} - 1]}")"
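
To see why this works, consider the array's contents when the line runs inside build-template after being sourced by ci (the paths shown are illustrative):

# BASH_SOURCE[0] = /home/user/bin/build-template   (the sourced file)
# BASH_SOURCE[1] = ./ci                            (the executed script)
#
# The last element always names the script that bash was asked to run,
# so dirname yields the child script's directory.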

The parent script's functionality is now reusable by child scripts. Implementing the most robust way to resolve the script's directory is left as an exercise for the reader.

Download Build Script Template

Download the build script template, distributed under the MIT license.

Install Build Script Template

Install the build script template into $HOME/bin and change its permissions, for example:

mkdir -p $HOME/bin
cd $HOME/bin
mv $HOME/downloads/build-template . # Change path as necessary
chmod 644 build-template

Note that the build-template does not need to be executable since it is never run directly. If $HOME/bin does not exist in the PATH environment variable, add it by appending the following line to $HOME/.bashrc:

export PATH="$PATH:$HOME/bin"

Open a new terminal window for the updated PATH value to take effect.

The parent script is extensible and can be referenced by any child script.

Revise Continuous Integration Script

Having an extensible shell script template reduces the amount of code in the continuous integration script. This section describes a small script that performs continuous integration. If you’re already familiar with bash, skip to the download continuous integration script section.

Source Parent Script

Near the top of the script, the first line after the shebang must be:

source build-template

This line imports the functions and global variables from the parent script into the child script. Making it the first line of code that bash encounters affords the child script the opportunity to redefine the functions and global variables imported from the parent script.

The built-in source command can find and import external shell scripts located in directories listed in the PATH environment variable.

Set Dependencies

Overwrite the global DEPENDENCIES array as follows:

DEPENDENCIES=(
  "inotifywait,https://github.com/rvoicilas/inotify-tools/wiki"
  "context,https://wiki.contextgarden.net"
  "pandoc,https://www.pandoc.org"
  "gs,https://www.ghostscript.com"
)

Instead of searching for a single printf dependency, the parent’s requirements function will look for the tools given in the above comma-delimited lists.

Override Execute Function

Next, override the execute function to look as follows:

execute() {
  local -r await=close_write,delete

  $log "Await file modifications in ${SCRIPT_DIR}"
  inotifywait -q -e "${await}" -m . styles | \
  while read -r directory event filename; do
    if [[ ! "${event}" == *ISDIR* ]] && filter "${filename}"; then
      $log "${directory}${filename} (${event})"
      build_document
    fi
  done

  return 1
}

Rename all occurrences of execute_build to build_document to improve clarity of intent.

Create File Name Filter

Create the following filter function:

filter() {
  [[ "${1,,}" == *\.*md || "${1,,}" == *\.tex ]]

  return $?
}

The filter function verifies whether a given file name should trigger rebuilding the document. If the file name extension matches either a Markdown or TeX file, the conditional expression will cause the value of $? to be set to 0; otherwise, the value is set to 1 and the document will not be rebuilt.
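
A few sample invocations (${1,,} converts the file name to lowercase before comparing):

filter "01.md"     && echo "rebuild"   # rebuild (Markdown file)
filter "MAIN.TEX"  && echo "rebuild"   # rebuild (case-insensitive)
filter "notes.txt" || echo "ignored"   # ignored (no match)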

Parse Command-line Arguments

As written, the parent script loops over all the command-line arguments, parsing out help and debug settings. To avoid duplicating that functionality, any unrecognised arguments are delegated to the argument function for further parsing. The following snippet shows how to parse child-specific command-line arguments by overriding the parent’s argument function:

argument() {
  local consume=1

  case "$1" in
    -f|--filename)
      ARG_FILE_OUTPUT="$2"
      consume=2
    ;;
  esac

  return ${consume}
}

The value returned indicates the number of command-line arguments that were successfully consumed. The parent script skips over the number of command-line arguments specified by the function’s return value.
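
For example, tracing ./ci --filename story --debug through both functions:

# 1. The parent's arguments() does not recognise --filename, so it
#    calls: argument "--filename" "story" "--debug"
# 2. argument() sets ARG_FILE_OUTPUT=story and returns 2.
# 3. The parent runs shift 2; the next argument parsed is --debug.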

Update the help for the command-line arguments as well by placing the following code fragment after the DEPENDENCIES variable:

ARGUMENTS+=(
  "-f,--filename,Output PDF file name"
)

Call Main

Knowing that build-template no longer calls main, the last line of the child script must be:

main "$@"

Update Build Function

Controlling the document’s presentation involves enclosing the ConTeXt document fragment produced by pandoc within a custom TeX file. First change build_document() as follows:

build_document() {
  local -r DIR_BUILD="artefacts"
  mkdir -p "${DIR_BUILD}"

  local -r FILE_MAIN_PREFIX="main"
  local -r FILE_BODY_PREFIX="${DIR_BUILD}/body"

  local -r FILE_CAT="${FILE_BODY_PREFIX}.md"
  local -r FILE_TEX="${FILE_BODY_PREFIX}.tex"
  local -r FILE_PDF="${FILE_BODY_PREFIX}.pdf"
  local -r FILE_DST="$(basename "${ARG_FILE_OUTPUT}" .pdf).pdf"

  $log "Concatenate into ${FILE_CAT}"
  cat ./??.md > "${FILE_CAT}"

  $log "Generate ${FILE_TEX}"
  pandoc --to context "${FILE_CAT}" > "${FILE_TEX}"

  $log "Generate ${FILE_PDF}"
  context --nonstopmode --batchmode --purgeall \
    --path=artefacts,styles \
    "${FILE_MAIN_PREFIX}.tex" > /dev/null 2>&1

  $log "Rename ${FILE_MAIN_PREFIX}.pdf to ${FILE_DST}"
  mv "${FILE_MAIN_PREFIX}.pdf" "${FILE_DST}"
}

Note the following modifications:

  - Build artefacts (the concatenated Markdown, the generated TeX body,
    and intermediate files) are written to a separate artefacts directory.
  - ConTeXt compiles main.tex, a new enclosure file described below,
    rather than the pandoc output directly; the --path option instructs
    ConTeXt to search the artefacts and styles directories.
  - The generated main.pdf is renamed to the file name supplied on the
    command line.

Keeping user-friendliness in mind, consider the following line:

  local -r FILE_DST="$(basename "${ARG_FILE_OUTPUT}" .pdf).pdf"

This allows the user to specify a file name with or without a .pdf extension. The second argument to basename instructs the command to strip any extension that matches the one given. If the file name does not end with .pdf, then no action is taken by basename. In both cases, .pdf is appended. Be aware that the extension comparison is case-sensitive.
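
For example:

basename "story.pdf" .pdf   # story
basename "story" .pdf       # story
basename "story.PDF" .pdf   # story.PDF (the extension does not match)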

Run Linter

At the time of writing, shellcheck cannot resolve sourced script paths given via environment variables. Running shellcheck is a good idea, but it needs hard-coded paths to lint both parent and child scripts successfully. The scripts avoid hard-coded paths because, unless they were installed system-wide, a hard-coded path would mean hard-coding a username.

Download Continuous Integration Script

Download the new continuous integration script, distributed under the MIT license.

Install Continuous Integration Script

Overwrite the ci script from Part 3 as follows:

mkdir -p $HOME/dev/writing/book
mv $HOME/downloads/ci $HOME/dev/writing/book

The first line will create the destination directory or fail silently if it already exists. The second line moves the ci script from where it was downloaded into the destination directory; change the download directory path as per your system’s configuration.

Style

ConTeXt documentation is extensive and its wiki has a large listing of commands. The remainder of this part provides a brief introduction to ConTeXt by producing a simply styled PDF file.

Macros

Typesetting operations in ConTeXt begin with a backslash (\). To avoid confusion with system commands run from bash, ConTeXt operations are hereinafter referred to as macros, rather than commands. Example macros include \blank (insert vertical whitespace), \framed (draw a frame around text), and \starttext (begin the document body).

Many ConTeXt macros have configuration options. The option values are listed in square brackets ([ and ]) following the macro name. For example, the following line inserts more vertical whitespace than \blank does alone:

\blank[big]

Options for macros can have key-value pairs, called setups. For example, drawing a blue border around Text uses the following syntax:

\framed[framecolor=blue]{Text}

The key is framecolor and its value is blue. The framed macro operates on the content between the open brace ({) and close brace (}). That is, the scope of the macro is constrained to the text placed between braces (also called curly brackets) that follow.

Typographical Operations

Typographical operations performed on text are typically enclosed by \start and \stop macros, such as:

\starttext
  Document body
\stoptext

Many operations already exist, including \starttext and \stoptext, \startfrontmatter and \stopfrontmatter, \startbodymatter and \stopbodymatter, and \startbackmatter and \stopbackmatter.

New operations can be defined in terms of existing operations using the define prefix. Consider this example, which creates a new type of framed macro:

\defineframed[WarningFrame][
  background=color,
  backgroundcolor=yellow,
  frame=off
]

The first option (WarningFrame) names the new frame. The setups change the frame’s properties. The new frame could be used as follows:

\WarningFrame{Never use open flame to check fuel level.}

Existing operations can be reconfigured using the setup prefix. For example, the following snippet changes all \framed occurrences in a document to use a lightgray background colour:

\setupframed[
  background=color,
  backgroundcolor=lightgray,
  frame=off,
]

This ability to reconfigure macros globally using setups makes separating a document’s content from its appearance relatively easy.

Main Enclosure

Create a new file called main.tex in $HOME/dev/writing/book having the following contents:

\input constants
\input colours

\input properties
\input paper
\input layouts
\input fonts
\input headings

\starttext
  \startfrontmatter
    % Table of contents
    \completecontent
  \stopfrontmatter

  \startbodymatter
    \input body
  \stopbodymatter

  \startbackmatter
    % ...
  \stopbackmatter
\stoptext

ConTeXt will scan both artefacts and styles for file names matching those specified by each \input line—note that the .tex extension is presumed by default. If any file is missing, ConTeXt will stop processing the file and then terminate.

Create Empty Styles

For now, create the styles directory and corresponding files that are referenced in main.tex:

mkdir -p $HOME/dev/writing/book/styles
cd $HOME/dev/writing/book/styles
touch constants.tex colours.tex properties.tex \
  paper.tex layouts.tex fonts.tex headings.tex

Restart Continuous Integration Script

Run the ci script as follows:

./ci -d -f story

The output should resemble:

[15:03:49.0815] Verify requirements
[15:03:49.0842] Preprocess
[15:03:49.0858] Await file modifications in .../book

When the PDF file is generated it will be named story.pdf.

Create a Chapter

Create or change $HOME/dev/writing/book/01.md to the following:

# Chapter Title

A **bold** sample.

## Section Title

An _italicised_ sample.

### Subsection Title

A **_bold italicised_** sample.

After saving, the ci script shows:

[15:03:05.5680] ./01.md (CLOSE_WRITE,CLOSE)
[15:03:05.5724] Concatenate into artefacts/body.md
[15:03:05.5754] Generate artefacts/body.tex
[15:03:05.6449] Generate artefacts/body.pdf
[15:03:07.6466] Rename main.pdf to story.pdf

Open story.pdf with Evince.

Define Constants

Often an author’s name and book title will both appear multiple times throughout the contents. Rather than maintain the values in multiple places, consider defining macros for them in the file constants.tex as follows:

\def\BookTitle{Book Title}
\def\BookSubtitle{Book Subtitle}
\def\BookAuthorPrimary{Your Name}
\def\BookKeywords{keyword1, keyword2}

Save the file. The ci script will trigger, but no changes will appear in the PDF file reader.

Define Colours

Ultimately, the choice of colour palette is best left to a professional designer. Nonetheless, a three-colour palette could be defined as follows:

\definecolor[ColourPrimary][h=542437]
\definecolor[ColourSecondary][h=C02942]
\definecolor[ColourTertiary][h=53777A]

\definecolor[ColourPrimaryDk][h=2c0615]
\definecolor[ColourPrimaryLt][h=935e73]
\definecolor[ColourSecondaryDk][h=740012]
\definecolor[ColourSecondaryLt][h=f28092]
\definecolor[ColourTertiaryDk][h=244d51]
\definecolor[ColourTertiaryLt][h=a6b6b8]

\definecolor[ColourBody][ColourPrimaryDk]
\definecolor[ColourHyperlink][ColourTertiary]

\setupcolors[
  state=start,
  rgb=yes,
  spot=no,
  pagecolormodel=auto,
  textcolor=ColourBody,
]

These definitions make consistent use of colours easier to achieve. Using pagecolormodel=auto prevents an issue with transparent colours.
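
Elsewhere in the document, the palette can be applied by name. For example, a hypothetical snippet that colours a phrase:

\color[ColourSecondary]{This text appears in the secondary colour.}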

Setup Document Properties

Usually entries in the table of contents are hyperlinked to the corresponding sections in the body. To link the table of contents, edit the properties.tex file then insert the following contents:

\setupinteraction[
  state=start,
  title={\BookTitle},
  subtitle={\BookSubtitle},
  author={\BookAuthorPrimary},
  keyword={\BookKeywords},
  color=ColourHyperlink,
  contrastcolor=ColourHyperlink,
]

The setupinteraction macro can enable document hyperlinks (state=start) and can set the title, subtitle, author, and keywords.

Save the file. Page one of the PDF file resembles:

Define Fonts

Because fonts are so flexible, controlling a document's fonts can be a bit involved. Start by choosing a suitable font pairing (see also selections from professional designers), which typically includes a body font and a suitable heading font.

Once the fonts are installed (e.g., into $HOME/.fonts/ttf), update the font cache for both the operating system and ConTeXt by issuing the following commands:

fc-cache -fv
mtxrun --script fonts --reload

The fc-cache command scans the font directories on the system and builds font information cache files for applications. The mtxrun command is a helper script that can, amongst its many features, display known fonts.

List and find the name of the installed fonts by creating and using a new script to search for fonts as follows:

pushd $HOME/bin
echo '#!/usr/bin/env bash' > fontgrep
echo 'mtxrun --script fonts --list --all --name $1 | cut -d" " -f1' >> fontgrep
chmod +x fontgrep
popd
fontgrep aleo && fontgrep arimo

If the fonts cannot be found, ensure OSFONTDIR is set in .bashrc as previously described. Note the exact names of the font family entries.
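
For example, a line such as the following in $HOME/.bashrc (adjust the directory as necessary) lets ConTeXt find fonts installed under the home directory:

export OSFONTDIR="$HOME/.fonts"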

Next, create a typescript definition for the content body and headings by inserting the following text into the fonts.tex file:

\starttypescript [serif] [BookBodyAleo]
  \definefontsynonym
    [Serif]           [name:aleonormal]     [features=default]
  \definefontsynonym
    [SerifBold]       [name:aleobold]       [features=default]
  \definefontsynonym
    [SerifItalic]     [name:aleoitalic]     [features=default]
  \definefontsynonym
    [SerifBoldItalic] [name:aleobolditalic] [features=default]
\stoptypescript

\starttypescript [sans] [BookHeadingsArimo]
  \definefontsynonym
    [Sans]           [name:arimonormal]     [features=default]
  \definefontsynonym
    [SansBold]       [name:arimobold]       [features=default]
  \definefontsynonym
    [SansItalic]     [name:arimoitalic]     [features=default]
  \definefontsynonym
    [SansBoldItalic] [name:arimobolditalic] [features=default]
\stoptypescript

Font names that ConTeXt recognises can be determined using fontgrep, which calls mtxrun to list fonts that partially match a given name.

Any number of fonts can be defined as typescripts in this way, which allows the document typefaces to be changed by swapping the values in the relevant definetypeface macro, described below. The following table lists the basic typeface styles:

Style   Meaning         Type           Example
rm      Roman           Serif          Tinos
ss      Sans serif      Sans serif     Roboto
tt      Teletype text   Monospace      Inconsolata
mm      Mathematics     Math           TeX Gyre family
hw      Handwritten     Cursive        Pinyon Script
cg      Calligraphy     Calligraphic   Tangerine

The style name is passed as an option to the definetypeface macro. For example, append the following lines to fonts.tex so that the desired fonts are associated with the relevant styles:

\definetypeface
  [BookTypeface] [rm] [serif] [BookBodyAleo]      [default]
\definetypeface
  [BookTypeface] [ss] [sans]  [BookHeadingsArimo] [default]

Update fonts.tex again to set up the body font with \setupbodyfont:

\setupbodyfont[BookTypeface]

When pandoc converts Markdown to ConTeXt code, text emphasis is marked up using the em macro. ConTeXt makes a distinction between emphasised text (\em) and italicised text (\it). Force ConTeXt to use italicised text for emphasis by appending the following snippet to the end of fonts.tex:

\setupbodyfontenvironment[default][
  em=italic,
]

Note that when using OpenType Fonts (.otf files), the following may be required to force emphasis as italics:

\definefontfeature[default][default][itlc=yes]

Page two of the document resembles:

Note how the heading font is the same as the body font. By default, \setupbodyfont changes the typeface for all aspects of the document, which includes the text body and headings.

Headings

Set the headings to use a sans serif font in various font sizes as follows:

\setuphead[section][
  style=\ss\tfd,
  textcolor=ColourPrimary,
  numbercolor=ColourPrimary,
]

\setuphead[subsection][
  style=\ss\tfc,
  textcolor=ColourSecondaryDk,
  numbercolor=ColourSecondaryDk,
]

\setuphead[subsubsection][
  style=\ss\tfa,
  textcolor=ColourTertiaryDk,
  numbercolor=ColourTertiaryDk,
]

Switching font styles is accomplished using \rm, \ss, \tt, \hw, and so forth. Font size switching uses \tfa through \tfd. By stacking the macros, the following output is achieved:

As expected, all headings are in a sans serif font.

Paper Size

Paper size and layout are related concepts. A postcard may have a paper size of 6 x 4¼ inches while its layout has no spacing for margins, headers, or footers. A variety of units for dimensions are available, including: points (pt), millimetres (mm), centimetres (cm), inches (in), and more. My preference is to place the unit of measurement directly after the digits, with no spaces. To change the paper size, first define a new paper size and then use it, such as:

\definepapersize[BookPaperSize][
  width=6in,
  height=4.25in,
]

\setuppapersize[BookPaperSize]

After saving the changes, the paper size will become postcard-sized:

Layout

Configuring the page layout follows the same approach as changing the paper size: first define a new layout (\definelayout), then set it up (\setuplayout). Replace the contents of layouts.tex with the following:

\definelayout[BookLayout][
  topspace=\zeropoint,
  backspace=\zeropoint,
  width=\paperwidth,
  height=\paperheight,
  header=\zeropoint,
  footer=\zeropoint,
]

\setuplayout[BookLayout]

The page margins disappear, as shown in the following screen shot:

All of the page is available for content.

Visual Debugging

ConTeXt offers ways to reveal how the content is affected by the document layout. For example, change layouts.tex as follows:

\definelayout[BookLayout][
  header=\zeropoint,
  footer=\zeropoint,
]

\setuplayout[BookLayout]

\showframe

Keeping only the header and footer setups while including \showframe reveals the margins:

Additional macros to show a variety of settings are listed on the wiki’s Debugging page. In particular, \showgrid and \showlayout are helpful.

Download

Download the entire configuration, distributed under the MIT license.

Summary

This part explained how to apply the open-closed principle to the build script template. In addition, this part introduced basic ConTeXt concepts, including how to configure colours, fonts, paper sizes, page layouts, and visual debugging. Part 5 describes how to use variables within Markdown documents.

About the Author

My software development career has spanned telecommunications, enterprise-level e-commerce solutions, finance, transportation, modernization projects in health and education, and much more.

Always delighted to discuss new opportunities, especially meaningful work with revolutionary companies that care about the environment.