Blog of Wade Urry

A software developer from the UK, currently Lead Architect at Radweb.

Being a new manager

2025-04-30 08:00:00

This post is a bit of a ramble

I've been managing a team, officially, for coming up to a year now. Overall it's been a positive and energising experience, but as with any journey there's an array of challenges and hurdles along the way.

Transitioning from being a member of the team for 7 years to managing that same team has been an emotional and confidence-testing challenge. Over those 7 years friendships have formed and trust has been built, and now those relationships are being tested. Being a friend and manager at the same time is a difficult balance, trying to avoid playing favourites while treating the team fairly. It's difficult to accept, but those friendships have changed and will continue to.

Keeping everyone happy is an impossible task, and you will upset people along the way, particularly when difficult and unappealing decisions need to be made. You are not here to be liked, or win a popularity contest. You are here to lead the team, challenge their skills, and grow together. This has been a difficult pill to swallow knowing your decisions can put pressure and stress on your team.

Feeling productive is a challenge when disruptions to your day are frequent, fires need to be put out, and your team, as well as other departments, need your input. What feels like an unproductive day full of interruptions can actually be hugely beneficial to the team, where you've helped unblock several team members and now multiple projects can move forward. It's easy to get lost in the day, but stepping back and viewing the bigger picture is important.

Learning your team's individual strengths and understanding how to utilise them is vitally important to delivering projects on time and to a high standard.

Hiring based on skill is rarely the right choice. Enthusiasm, passion, and a personality that fits in with the team are far more important. Skills can be learned, mentored, and developed.

Being open with your team is a difficult balancing act. Sharing information and being transparent can help your team understand the reasoning behind decisions and boost their motivation to deliver, but it can also lead to frustration and anger when the information is not what they want to hear. Information may not always be received in the way you intend, so you need to evaluate it from multiple viewpoints before sharing.

Ultimately the experience has been challenging and rewarding. I have learned a lot about myself, overcome many hurdles, and pushed myself into many areas I never thought I would.

File streams with PHP

2025-03-13 08:00:00

File streams allow you to interact with a data source without having to load the whole resource into memory. This avoids hitting memory limits when working with larger files.

Let's look at a basic example of reading a file from disk

// Open the file
$resource = fopen('/path-to-my-file.txt', 'r');

// Read contents into a variable
$contents = stream_get_contents($resource);

// Close the file
fclose($resource);

It's important to remember to use fclose() once finished with the stream, otherwise the file descriptor will remain open for the lifetime of the process. PHP will automatically close all file descriptors at the end of the request when using php-fpm or mod_php. However, in CLI and other long-running SAPI environments such as Swoole, this will leak memory (and file descriptors) and can eventually cause your PHP server to be killed by the OS.

fopen() may be used to interact with data from many resources. These include files from disk, http(s), and in-memory storage. For a full list of supported protocols, see the php documentation
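For instance, the same functions work against a remote URL or an in-memory buffer. A quick sketch, where the URL is just a placeholder and reading remote URLs assumes allow_url_fopen is enabled:

// Read a remote resource over HTTPS (placeholder URL)
$remote = fopen('https://example.com/data.txt', 'r');
$body = stream_get_contents($remote);
fclose($remote);

// Write to and read from an in-memory buffer instead of a file on disk
$buffer = fopen('php://memory', 'w+');
fwrite($buffer, $body);
rewind($buffer);
print stream_get_contents($buffer);
fclose($buffer);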

Modes

You may have noticed the r used in the fopen('/path-to-my-file.txt', 'r') call. This specifies the type of access you need to the resource.

There are several modes, typically r (read) and w+ (read and write) are the most commonly used. See the php documentation on fopen() for more information.
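As a rough cheat sheet of the modes I reach for most often (the path is a placeholder, and each handle should still be fclose()d once you're done):

// 'r'  - read only; fails if the file does not exist
$read = fopen('/path-to-my-file.txt', 'r');

// 'w'  - write only; truncates the file, creating it if necessary
$write = fopen('/path-to-my-file.txt', 'w');

// 'a'  - write only; every write is appended to the end of the file
$append = fopen('/path-to-my-file.txt', 'a');

// 'w+' - read and write; truncates the file, creating it if necessary
$readWrite = fopen('/path-to-my-file.txt', 'w+');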

Filters

Filters can be used to modify data when reading from or writing to a stream. PHP ships with several built-in filters, including base64, rot13, and upper/lower case transforms. You may also register your own custom filters (see the sketch after the examples below).

// File contains: VGhpcyBpcyBteSBzdHJpbmc=
$stream = fopen('/a-base64-encoded-file.txt', 'r');

stream_filter_append($stream, 'convert.base64-decode', STREAM_FILTER_READ);

// Output: This is my string
print stream_get_contents($stream);

fclose($stream);

You can also use filters to modify data when writing to a stream

$stream = fopen('/my-file.txt', 'w');

stream_filter_append($stream, 'string.toupper', STREAM_FILTER_WRITE);

fwrite($stream, 'this is my string');

fclose($stream);

// File contents: THIS IS MY STRING
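To give a flavour of custom filters, here's a minimal sketch using stream_filter_register() and the php_user_filter base class; the filter name rot13.custom and the class are made up for this example:

class Rot13Filter extends php_user_filter
{
    public function filter($in, $out, &$consumed, $closing): int
    {
        while ($bucket = stream_bucket_make_writeable($in)) {
            // Transform each chunk of data as it passes through the stream
            $bucket->data = str_rot13($bucket->data);
            $consumed += $bucket->datalen;
            stream_bucket_append($out, $bucket);
        }

        return PSFS_PASS_ON;
    }
}

// Register the filter under a name, then attach it like a built-in filter
stream_filter_register('rot13.custom', Rot13Filter::class);

$stream = fopen('php://temp', 'w+');
stream_filter_append($stream, 'rot13.custom', STREAM_FILTER_WRITE);

fwrite($stream, 'This is my string');

// The stored data was transformed by the filter on write
rewind($stream);

// Output: Guvf vf zl fgevat
print stream_get_contents($stream);

fclose($stream);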

IO stream aliases

PHP has a number of io stream aliases to easily interact with common resources. These include stdin, stdout, and stderr when in a CLI context, but most useful are the memory buffer and temporary files.

$buffer = fopen('php://memory', 'w+');

$temp = fopen('php://temp', 'w+');

$stdin = fopen('php://stdin', 'r');

$stdout = fopen('php://stdout', 'w');

You can find more about these in the php documentation
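One detail worth knowing: php://memory keeps everything in RAM, while php://temp starts in memory and spills to a temporary file once it grows beyond a threshold (2MB by default). The threshold can be tuned in the path; a small sketch, where $payload is a placeholder for whatever data you're buffering:

// Keep up to 5MB in memory before spilling to a temporary file on disk
$temp = fopen('php://temp/maxmemory:' . (5 * 1024 * 1024), 'w+');

fwrite($temp, $payload);

rewind($temp);
$contents = stream_get_contents($temp);

fclose($temp);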

Seeking

So far my examples have shown reading or writing the full file contents; however, it is possible to read and write parts of the resource, as well as seek forwards and backwards.

The fseek() function allows you to move the current cursor position in the file to a specific point. rewind() will move the cursor back to the beginning of the resource.

$stream = fopen('my-file.txt', 'r');

// Read up to 4kb of data (fgets() stops at a newline or after length - 1 bytes)
$data = fgets($stream, 4096);

// Seek back to the beginning of the file. rewind($stream) achieves the same
fseek($stream, 0);

Beware! Not all streams are seekable, and attempting to seek on one that isn't will result in an error. You should first check that your stream is seekable. Non-seekable streams include sockets, pipes, and callback/pump resources.

$stream = fopen('my-file.txt', 'r');

$meta = stream_get_meta_data($stream);

if ($meta['seekable'] === true) {
    rewind($stream);
}

$data = fgets($stream, 4096);

PSR-7 HTTP Message Interfaces

You might be asking "What do HTTP Messages have to do with file streams?".

While PSR-7 is predominantly about HTTP payloads, the StreamInterface provides a simple, uniform API for working with any file stream.

$resource = fopen('php://temp', 'w+');

$stream = new Stream($resource);

if ($stream->isSeekable()) {
    $stream->seek(0);
}

$data = $stream->read(4096);

$stream->close();

There are several PSR-7 implementations, but mostly I use guzzle/psr-7 as it comes bundled with the guzzle http client.

More details on PSR-7 can be found on the PHP-FIG website

Guzzle Utils

The guzzle/psr-7 library comes with a useful set of utils for interacting with streams. Seeing as it's a PSR-7 implementation, as you'd expect, the streams it interacts with need to be StreamInterface instances.

Opening a stream

Utils::tryFopen() returns an open resource or throws an exception, whereas fopen() used directly returns false when it fails to open the resource.

$fd = Utils::tryFopen('php://temp', 'w+');

Instantiate a StreamInterface based on input

Creating a StreamInterface without knowing the underlying resource can be cumbersome. Utils::streamFor() does the heavy lifting for you.

Utils::streamFor('a simple string');

Utils::streamFor(Utils::tryFopen('php://temp', 'w+'));

Utils::streamFor($generator);

Utils::streamFor($iterator);

Copy contents

$source = Utils::streamFor(Utils::tryFopen('/my-large-file.txt', 'r'));

$dest = Utils::streamFor(Utils::tryFopen('php://temp', 'w+'));

Utils::copyToStream($source, $dest);
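copyToStream() also takes an optional maximum length as a third argument, which is handy if you only need part of the source; for example:

// Copy only the first 1MB of the source into the destination
Utils::copyToStream($source, $dest, 1024 * 1024);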

Stream a file to the client browser

If you have a large file you want to send to the client browser, returning it as a string in the HTTP response may cause an out of memory error. To avoid this, file streams allow you to read the file in chunks and send each one to the client, freeing memory before reading the next chunk.

$fd = fopen('/my-large-file.txt', 'r');

// Read file and pass to the output interface. fpassthru() uses settings from the php.ini to determine chunk size when reading the file
fpassthru($fd);

fclose($fd);

Or when in a Laravel controller

return response()->streamDownload(function () {
    $stream = Utils::streamFor(Utils::tryFopen('/my-large-file.txt', 'r'));

    // Check for end of file
    while (!$stream->eof()) {
        // Read 100kb
        $buffer = $stream->read(1024 * 100);

        // Send buffer to output interface
        print $buffer;
    }

    $stream->close();
});

Handling CSVs

File streams are particularly useful when working with CSV files. CSVs often contain large amounts of data and can easily run into memory problems.

My favourite library for dealing with CSV files is league/csv.

$csv = <<<EOF
id,name,email
1,Joe,joe@example.com
2,Jane,jane@example.com
EOF;

// Write the CSV data to a temporary stream for the reader
$stream = Utils::tryFopen('php://temp', 'w+');
fwrite($stream, $csv);
rewind($stream);

$reader = Reader::createFromStream($stream);

// Treat the first row as the header so each record is keyed by column name
$reader->setHeaderOffset(0);

/** @var array{id: string, name: string, email: string} $record */
foreach ($reader->getRecords() as $record) {
    // Process data
}

unset($reader);

fclose($stream);

The same applies for writing large amounts of data to CSV.

$stream = Utils::tryFopen('php://temp', 'w+');

$csv = Writer::createFromStream($stream);

foreach ($datasource as $data) {
    $csv->insertOne([
        'id' => $data->getId(),
        'name' => $data->getName(),
        'email' => $data->getEmail(),
    ]);
}

unset($csv);

fclose($stream);
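A header row can be written with insertOne() too, and once the writer has finished, the underlying stream can be rewound and sent straight to the output without ever building the CSV as a single string. A small self-contained sketch, with placeholder row data:

$stream = Utils::tryFopen('php://temp', 'w+');

$csv = Writer::createFromStream($stream);

// Header row followed by the data rows
$csv->insertOne(['id', 'name', 'email']);
$csv->insertOne(['1', 'Joe', 'joe@example.com']);

unset($csv);

// Rewind and send the generated CSV to the output in chunks
rewind($stream);
fpassthru($stream);

fclose($stream);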

I don’t want tax cuts

2024-03-06 08:00:00

If there’s money for tax cuts, then there was money available for public services.

Our services are at breaking point. My county council declared that it is bankrupt for the 2023/2024 tax year. Many other councils have been doing this, more than ever before. Their first course of action was to make severe cuts to care homes and services for the elderly and disabled.

The NHS is in a dire situation, it’s a struggle to see a doctor when you need one. Waiting lists are at record highs, despite endless promises from this Tory government to reduce them.

I don’t want tax cuts, I want public services to be funded better, their employees happy, and their users happy. Make the services better for everyone. Make the infrastructure better for everyone.

I don’t want tax cuts in order to essentially pay me a dividend in return. We aren’t shareholders in a private company where the aim is to extract as much profit as possible.

I don’t need a tax cut. Tax cuts only benefit richer people: the more you earn, the more a tax cut works for you. This is morally wrong. I need public services; we all need public services, transport, and functioning infrastructure.

I want my local area to be healthy, happy, clean, and well maintained.

I don’t want tax cuts.

Just give me an API key

2024-01-21 08:00:00

I've been working on adding a now page to my site, which is largely automated pulling in data from Apple Health, Strava, and last.fm.

Using the Strava API was incredibly frustrating as they force you to use OAuth. I appreciate we use OAuth to authenticate multiple users against a single application, limit access to data through scopes, and frequently rotate tokens to prevent replay attacks, but in this case I only want to access my own data.

Whatever happened to services providing a simple API key? I don't want to OAuth just because I want to run a single GET query on my own data.

My Review of 2023

2023-12-30 08:00:00

2023 was by far the biggest change of my life so far. We had our first child, I lost a lot of weight, and made some big changes to my lifestyle.

Weight Loss

I started 2023 weighing at least 120kg; I don't know how heavy I actually was. I'd stopped weighing myself because I didn't want to see the number.

Finding out we were expecting our first child gave me a huge kick to get my butt into action and be more active. I knew my weight would get in the way of looking after our child, and I didn't want that to happen.

I started by walking more, and after a couple of weeks I'd built up to 10km a few times a week. At the start of February I decided to go along to my first Parkrun, and have been going almost every week since. I'd also started running 5k at least twice a week, including Parkrun, and sometimes up to 4 times. I'd get up at 6am to complete my 10km walk before work. Some weekends I'd cycle 40km or more; other times I'd cycle to work, which was a bit under 10km each way.

One of the biggest points here was that I avoided the gym. I knew that if I joined a gym I'd give up the second I stopped going, as has happened several times over the years. I wanted to make positive changes to my lifestyle that I could stick to and involve my family in.

I hit a lifetime low weight of 87kg in September, and have since been hovering around 90kg. After our first child arrived I moved my focus to my activity levels, worrying less about losing weight and just maintaining it. In 2024 I want to get back to losing weight, with a goal of 80kg.

Parkrun

I completed my first Parkrun on the 4th of February, and as of today, the 30th of December, I've just completed my 40th.

For those that don't know, Parkrun is a free, timed 5k that takes place every Saturday at 9am. It's a run, jog, walk, or whatever you want it to be. It's very inclusive, and there's always a wide range of abilities taking part: fast or slow, young and old, buggies and pets.

I've made consistent improvements to my time over the year, and I'm now able to consistently run 5k in under 30 minutes. My current PB is 27:26, and I hope to get this under 25 minutes in 2024.

Our first child

We found out in late 2022 we were expecting our first child. This was a huge change to our lives, and we were both very excited.

We set about planning how we'd arrange the house, which room we'd use for the nursery, and so on. One afternoon we were at the baby shop getting gifts for our pregnancy reveal, and while there we looked at nursery furniture. We found the set we were happy with and decided to measure up the room; that same day we ended up ordering all the furniture, pushchair, car seat, crib, the whole works. We might have gotten a little carried away.

I would highly recommend NCT Antenatal Courses. The classes were great for meeting a group of people in the same circumstance. We still keep in touch with them, and we've heard from others that even 10, 20, 30 years on they're still in touch with friends they made via their NCT group.

There were two main points we took away from NCT and put into practice.

Firstly, this is one of the most stressful things most people will go through, so seeing the baby isn't free: visitors must support you in being the best parents you can be. They need to help around the house before baby time, be that (un)loading the dishwasher, doing the washing up, hanging the laundry, hoovering, or cooking dinner.

Secondly, write a list of hobbies and interests for yourself and your partner. Take it in turns, once a week or month, to pick something from that list and spend time doing it while the other parent spends the day with the little one. Spend deliberate time focusing on who and what you are, and what you enjoy.

Becoming a single car household

I'd like to say the decision behind this one was a conscious environmental choice, but that's just a happy side effect. The reality is that post-Covid we both work from home, so we just don't need a second car. We'd kept our second car for 2 years after last needing it, and my wife had grown tired of having to keep driving it just to maintain the battery and keep it serviced.

Adjusting to this took a little while. During the summer I started cycling to the office if my wife needed the car for the day. Sometimes we'd double-book, which left us in a bit of a pickle; other times my wife would have to drop me at the office before heading off to do what she needed to.

We now have a whiteboard on the fridge where we plan our meals for the week. We also write who will need the car when.

Learning German 🇩🇪

Both sides of our family have German relations, but I'd never learned any German, bumbling my way through every visit and relying on family to translate.

This year I set about learning German using Duolingo, getting to a 100 day streak. Since our little one arrived this has been put on the back burner, so in 2024 I want to pick this up again. Particularly as we want our kid to be bilingual.

TL;DR

  • Lost 30kg through diet and exercise
  • Had a child
  • Learned some German
  • Became a single car household

Looking into 2024

Going forward into 2024 I want to set myself a few goals to focus on:

  • 🏋️‍♂️ Get down to 80kg
  • 🏃‍♂️ Reduce my 5k time to 25 minutes
  • 🇩🇪 Continue learning German
  • 👨‍💻 Launch my new SaaS

Laravel Queue Exiting with Status 137

2023-12-12 08:00:00

We recently had a problem with our Laravel queue workers. We kept finding that a particular queue job was silently failing; not always, but often enough to cause us a significant headache.

We weren't getting any of the usual logs or exceptions we'd expect with a potential timeout, memory issue, etc. We resorted to adding some debug logging in production, profiling our memory usage, etc. but nothing was helping us get to the bottom of the issue.

Eventually, I found something useful in the supervisor logs: exit status 137; not expected. Immediately I jumped to it being an OOM issue, which was my suspicion, but we'd already been profiling and monitoring our usage. Nothing unexpected there, always below 100mb used against our limit of 128mb, the default for Laravel queue workers. There was also nothing in the syslog or kern.log to suggest the OOM killer had been invoked.

Digging through the Laravel internals to check how the --memory flag works, I found that it gives an exit code of 12 when the limit is exceeded. So not that. Back to square one, we set about refactoring the job to try and reduce the memory usage; now we were under 60mb, but still the issue persisted.

We use SQS for our queue, so we have to manually extend the job visibility timeout, otherwise SQS will release the job back to another worker. So when the job did fail we'd see the expected 15-minute delay before another worker picked up the job again. This threw us off the scent of it being a timeout issue; usually with a timeout we'd expect to see the job retried almost immediately by another worker.

Cutting to the chase, we had set private int $timeout = 60 * 15; in the job class. This meant our manual SQS visibility timeout extension worked as expected, but Laravel couldn't read the timeout when it was serialising the job onto the queue. So when the job was picked up by a worker it would time out after the default 60 seconds and receive SIGKILL via the pcntl extension. This is what was causing the exit code 137.

The fix was to set the $timeout property to be public, public int $timeout = 60 * 15; so that Laravel can serialize the data onto the queue correctly.
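For illustration, a minimal sketch of the job class before and after the fix (the class name and handle() body are hypothetical):

use Illuminate\Contracts\Queue\ShouldQueue;

class ProcessReport implements ShouldQueue
{
    // Before: Laravel couldn't read this when serialising the job,
    // so the worker fell back to the default 60 second timeout
    // private int $timeout = 60 * 15;

    // After: public, so the timeout is serialised onto the queue with the job
    public int $timeout = 60 * 15;

    public function handle(): void
    {
        // long running work...
    }
}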

This was compounded by the fact that PHP has some unexpected behaviour when accessing private properties of a class from outside it. Normally a direct access would throw an error, but if you use the null coalescing operator ?? the inaccessible property is treated as unset, so the expression silently evaluates to the fallback.
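A minimal sketch of that behaviour (the class and property names are just illustrative):

class Example
{
    private int $timeout = 60 * 15;
}

$job = new Example();

// Direct access from outside the class throws an Error
// echo $job->timeout;

// With the null coalescing operator the inaccessible property is treated
// as unset, so this silently falls back without any error
echo $job->timeout ?? 60; // 60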

None of this explains why we weren't seeing the expected timeout or max attempts exceeded exceptions, but we got to the bottom of the issue. Hopefully, this helps someone else out there.