Symbiote news

Keep in touch with what we're up to.

RSS available too if that's your thing!

New sites, new digs

Marcus Nyeholt

Posted 1 May 2018


It's been a busy time here at Symbiote; not only are we growing to the point of needing to find a new office (stay tuned for office warming info!), but we've got a new PTV site in the pipeline. 

Built with React on top of an AWS backend, the beta site is a taste of things to come, with a full release later this year. Of course, under the covers SilverStripe is managing the core content delivery and initial data presentation. 

PTV's beta website

GUI apps with Docker

Marcus Nyeholt

Posted 27 Mar 2018


Docker is a thing here at Symbiote - we use it pretty extensively for our development environments, and are working on getting it out as a hosting option. While it's great for "server" apps, GUI apps can be a bit more difficult to get going.

I recently needed to get JMeter running again, and wanted a replicable environment for running the workbench application for putting together tests, one that I could share with others. Thankfully, someone else has done the hard work of figuring out how to get things going - and after a couple of hoop jumps, I have a functional Dockerfile below.

# inspired by  and
# and 

# Docker run with
# docker run --rm -it \
#     --name jmeter \
#     -e DISPLAY=$DISPLAY \
#     -v /tmp/.X11-unix:/tmp/.X11-unix \
#     -v /tmp:/tmp \
#     -v $HOME/.Xauthority:/root/.Xauthority \
#     -w /tmp \
#     symbiote/jmeter:3.3

FROM ubuntu:16.04

# JMeter version and locations; the download URL is Apache's archive mirror
ARG JMETER_VERSION="3.3"
ARG TZ="Australia/Melbourne"

ENV JMETER_HOME /opt/apache-jmeter-${JMETER_VERSION}
ENV JMETER_BIN ${JMETER_HOME}/bin
ENV JMETER_DOWNLOAD_URL https://archive.apache.org/dist/jmeter/binaries/apache-jmeter-${JMETER_VERSION}.tgz

RUN sed 's/main$/main universe/' -i /etc/apt/sources.list && \
    apt-get update && apt-get install -y software-properties-common && \
    add-apt-repository ppa:webupd8team/java -y && \
    apt-get update && \
    echo oracle-java8-installer shared/accepted-oracle-license-v1-1 select true | /usr/bin/debconf-set-selections && \
    apt-get install -y oracle-java8-installer libxext-dev libxrender-dev libxtst-dev curl && \
    apt-get clean && \
    mkdir -p /tmp/dependencies && \
    curl -L --silent ${JMETER_DOWNLOAD_URL} > /tmp/dependencies/apache-jmeter-${JMETER_VERSION}.tgz && \
    mkdir -p /opt && \
    tar -xzf /tmp/dependencies/apache-jmeter-${JMETER_VERSION}.tgz -C /opt && \
    rm -rf /var/lib/apt/lists/* && \
    rm -rf /tmp/*

# TODO: plugins (later)
# && unzip -oq "/tmp/dependencies/JMeterPlugins-*.zip" -d $JMETER_HOME

# Set global PATH such that "jmeter" command is found
ENV PATH $PATH:${JMETER_HOME}/bin

# The magic bit that makes the .x11 mapping work out
RUN mkdir -p /home/developer && mkdir -p /etc/sudoers.d && \
    echo "developer:x:1000:1000:Developer,,,:/home/developer:/bin/bash" >> /etc/passwd && \
    echo "developer:x:1000:" >> /etc/group && \
    echo "developer ALL=(ALL) NOPASSWD: ALL" > /etc/sudoers.d/developer && \
    chmod 0440 /etc/sudoers.d/developer && \
    chown developer:developer -R /home/developer


USER developer
ENV HOME /home/developer

CMD ${JMETER_BIN}/jmeter
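Before the `docker run` command in the header comment will actually show a window, the host's X server needs to accept connections from the container. A minimal sketch for a Linux host, assuming the `symbiote/jmeter:3.3` tag used above:

```shell
# Build the image from the directory containing the Dockerfile
docker build -t symbiote/jmeter:3.3 .

# Allow local (non-network) clients, which includes containers sharing
# the /tmp/.X11-unix socket, to talk to the host's X server
xhost +local:

# ...then launch with the `docker run` command from the header comment,
# making sure DISPLAY is passed into the container's environment.
```

Note that `xhost +local:` loosens X security for all local connections; the `.Xauthority` mount in the run command covers the cookie-based path as well.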

Cleaning up after yourself

Marcus Nyeholt

Posted 6 Mar 2018


Ever taken a backup of a site database and gone to copy it somewhere, and been surprised how long it's taking? Tried copying a site database locally to debug an issue and wondered why the restore fails? 

We have a few modules we use regularly - Queued Jobs and Data Change Tracker are probably the most guilty - that can silently fill a database table with millions of records depending on your usage. Consider the following job:

class RegularJob extends AbstractQueuedJob {

    public function process() {

        // ... expensive processing code goes here ...

        $nextJob = new RegularJob();

        // run every five minutes except between midnight and 7am
        $nextTime = date('H') < 7 ? date('Y-m-d 07:00:00') : date('Y-m-d H:i:s', time() + 300);

        $this->queuedJobService->runJob($nextJob, $nextTime);
    }
}

Pretty straightforward: every five minutes, run some expensive processing code, unless it's between midnight and 7am. The side effect is that every day, around 200 records are added to the QueuedJobDescriptor table, on top of any other jobs running. Not a huge deal, but over a year that's more than 70,000 records. With other processing jobs happening on a very interactive site, this can lead to a multi-million row table. 

Data Change Tracker is another module with this slow creep of content buildup, exacerbated by the fact that in its most aggressive mode it will capture data object contents (before and after) as well as request variables. 

Solving it with ... more jobs.

Luckily, both these modules come with cleanup jobs to ease the process. 

CleanupJob - removes all completed jobs older than 30 days (the threshold is configurable via YAML). It can be created from the Jobs admin section of the CMS. 

PruneChangesBeforeJob - create it via the Jobs admin section with a single constructor argument: a strtotime-compatible string describing how far back to prune data changes. For example, "-1 month" removes all tracked changes older than one month. We find "-3 months" works pretty well here. 
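As a sketch of the YAML tuning mentioned above - the property names below match the queuedjobs module's CleanupJob at the time of writing, but check them against your installed version, and treat the file path as a hypothetical example:

```yml
# mysite/_config/jobs.yml (hypothetical project config file)
CleanupJob:
  is_enabled: true
  cleanup_method: 'age'   # prune by age rather than by number of jobs kept
  cleanup_value: 30       # days of completed jobs to keep
```

With `is_enabled` on, the job re-queues itself, so the pruning keeps happening without anyone remembering to run it.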
