Coding meetups around York (UK)

Here's a list of development, hacking, or process-related interest groups that meet up around York (that I'm aware of). You may see the same faces at several of these groups; York is small! 😉

Agile Yorkshire (Leeds)

Agile Yorkshire meets on the second Tuesday of each month. It sometimes features coding/tech intros, but mostly focuses on Agile processes and sharing experiences.

Twitter: @agileyorkshire


Code & Coffee (Leeds and York)

A spin-off from Agile Yorkshire, Code & Coffee is an early-morning wake-up call in Leeds and York, for those who don't like a lie-in! Meets every Wednesday, early doors.


York Geek Club

The York Geek Club meets on the first Thursday of each month at The Habit on Goodramgate, 8pm.

Twitter: @yorkGeekClub


York Code Dojo

The York Code Dojo meets up every month, focusing on pairing up on different topics.

Twitter: @yorkCodeDojo


York Hackspace

York Hackspace meets every Wednesday evening to hack something – hopefully not someone. 😉

Twitter: @YorkHackSpace


York Game Developers

The York Game Developers group meets up in a beer slipper on the second Thursday of every month.


Any more?

My Unity/Blender Notes – Working with animations

This isn't a well-crafted blog post. Not this time – it's just a dumping ground for workflow findings from trying to get animations from Blender into Unity.


Adding armatures and bones for Unity

Select the object

Press Shift A

Add the armature with a single bone


Name the armature (good practice)

And make sure you rename the bone too – you will regret it if you don't when all you see is Bone, Bone, Bone in your animation editors.


Select the object only, then Shift-select the armature (make sure you can see it)


Now we are going to parent the object to the armature.


TIP: Before doing this, if you extruded the top face of this cube many times, you could create many bones from this single bone, all connected (go to Edit mode, click the top circle, and press E to extrude the bone); the step above will then AUTOMATICALLY connect all your faces to the bones – instant win. Press Enter to confirm a bone's position. If snapping is on, you can hold Ctrl to disable it temporarily while moving.

This option adds a vertex group to the object AND assigns the bone to the group – saving you from having to do it yourself. If you want more control over the vertices, you have to do this manually. Your bone name must match the name of the vertex group.

To see this association, select the cube, and enter Edit mode, then highlight Cube Bone as below, and deselect/select the vertices associated with the vertex group.


You'll also notice that this action has added an Armature (deform) modifier to the cube.

Now select the armature

Enter the pose mode


Press R to rotate and move the mouse – you'll now see that the cube moves with the armature

If you want to see your armatures/bones, enable X-Ray.




You might want the new bone to control different vertices – simply create another vertex group with the same name as your new bone and assign the faces it should affect.


Don’t forget to name the bone


Now add the vertex groups for the cube


Using the select faces tool, select the faces that should be affected by that bone, and press Assign.

Select/deselect each vertex group to check that things are good. You may need to remove all vertices from the original vertex group if you've extruded faces from it, but here's what you should be aiming for: the top should bend as you rotate its bone.


However, the first bone we created doesn't independently move the base – it moves the entire object because it's the root bone. You can extrude another bone beneath it to properly react to the independent movement. Here's another, CubeBoneBottom – and the main bone has no vertex group assignment.

You can then import the .blend into Unity.

Then drag the object into the scene and bring up the animation window.

Adding 'properties' adds the bits you want to animate (position, rotation, scale)

You can enter big values to simulate multiple rotations, as well as negative values to rotate back.

You can also edit the curves to produce different interpolation.

Click the timeline to move the keyframe position.

If you lose the animation, drag the newly created controller into the 'Animator' component.
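You can also wire this up from script if the reference keeps getting lost; a quick sketch (the "Anims/CubeController" Resources path is made up for illustration):

```csharp
using UnityEngine;

public class AssignController : MonoBehaviour
{
    void Start()
    {
        // Assign an animator controller in code instead of dragging it in the editor.
        Animator animator = GetComponent<Animator>();
        animator.runtimeAnimatorController =
            Resources.Load<RuntimeAnimatorController>("Anims/CubeController");
    }
}
```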

You can also drag-select a box over the keyframes and move them to a new timing.

However, you can't change the shape of the curves much – so if you want elastic effects you have to do this in Blender using the NLA editor, Action editor, curve editor and dope sheet.

That’s a whole world of pain!


P.S. I know I haven't mentioned Actions in Blender yet, but be warned…

Sometimes you want a single 3D model to contain many moving parts of separate subjects as part of one animation. That means many armatures under one action – and Blender doesn't like this one bit. To do this, save each action as a new file, e.g. model@action.blend, and tie the actions together in Unity.

If you only have one armature with many bones, you'll be fine (e.g. a character, or a single set of swinging doors).




I wrote a book on Open Source Software Licensing, but why?

One part of my role over the last 15 years has been to police the flurry of software components that developers want to leverage. In response, I developed internal systems and processes, and gained experience interpreting licence agreements. I've always tried to keep on top of such things because I didn't want to be distracted by any repercussions; I just want to get back to code.

During a recent customer deal, my experience expanded when one client insisted that, as part of a software escrow deposit, any hint of open source contamination could trigger code exposure to the customer for investigative purposes. This was swiftly followed by a highly elevated level of scrutiny over our codebase, leading us to outsource independent verification of our findings. We wanted confidence that we had taken all reasonable and diligent steps to protect ourselves and our assets. Being competent, confident and responsive to third-party component usage information requests has also made further deals and engagements progress more smoothly for all involved.

Over the last year, I bought a number of books on Open Source Software Licensing to solidify my knowledge. It quickly dawned on me that not many provide pragmatic advice on how to start taking control of the problem, or how to manage it day-to-day. They are often filled with more information than (I) needed.

When I work, every minute is precious – I hate being inefficient or ineffective, and I'm sure others do too. So I decided to write a book that would get straight to the point, helping people like me who really just want to get back to coding as soon as possible.

I thought it was going to be easy – I was wrong.

At first I wanted to focus on the pure pragmatic advice, and forgo dull introductions, build-up and terminology. But it soon became apparent that I needed to build some foundational knowledge in the reader before laying out my experiences. This started to pollute my vision, but it felt a necessary evil, and I suddenly understood why many books do this: everyone needs to be on the same page (pun by design). I also wanted a quick-fire Q&A section, but realized that it would mainly be a copy and paste of the content in the book, as it was already fairly minimal.

I was writing during the evenings and most weekends, and I soon realized that writing on such a topic was hard going. Every time I wrote a sentence I doubted that what I was saying was fact, believing it was just my crazy opinion; so I then had to research and find corroborating support from books or online articles. This obviously slows down the process, while adding legitimacy and protecting me from embarrassment. Further slowdowns were caused by writer's block – or, as I like to call it, word blindness – and imposter syndrome.

During the process I discovered many articles and useful websites I just had to try and include in the book. Unfortunately, that broke the planned structure. In the end I rewrote it about fifty-five billion times – give or take one. Rewrites and reorganizations take time, and you also get confused about what you've reviewed and whether you've already talked about a topic. Rewrites also introduce more grammatical errors, something I'm already terrible at.

Eventually I just decided to stop worrying about it and dump all the content into the book, then go through it linearly (many times, in both directions) to make sure I hadn't stated the same thing twice (or at least not verbosely). Then I streamlined some more, making sure the pragmatic ethos was applied. After drawing some pretty diagrams, formatting headings, adding emphasis to the correct words and getting page breaks right, I felt ready to publish. The PDF was ready.

Little did I know that it would take another month to publish. But kind words from those I gave pre-release copies to really helped push me on – @MrRio and Laura Turner, to name a couple.

If only I'd googled 'how to write for a Kindle device', things would have been a lot smoother. No one set of formatting seemed to work well across the iPhone/iPad/Fire readers. After many nights of pain, I had to drop Apple Pages and use Microsoft Word; all the beautiful formatting and indentations were lost, and artwork had to be redone. The Amazon cover page creator is seriously rubbish too – I started to lose the will to live.

It almost got to the point where I was going to publish regardless of how scraggly it looked, and never read it again. When I told my friend Amelia, a professional copy editor, about my Kindle woes, she just constantly nodded, as if she'd heard it all before.

I've sworn never to write a book again; the process has been that traumatic. It's not something I enjoyed, but I don't regret doing it. After all, it's kinda cool to say I'm an internationally published author!

As for sales, I'm not going to retire anytime soon. At the time of writing I've had one purchase in Japan, one in North America, and 70 normalized pages read through the Kindle subscription services. I knew this was going to be a niche market, so no surprise there!

I do have some plans to neaten the book up a little; for a start, the bibliography and references section needs help – I was largely at the mercy of the Microsoft Word to ePub conversion, and I gave up the fight.

I could also do with adding recent events regarding the Oracle Java API case, and elaborating more on some of the popular licences in play and their associated risks/permissiveness.

I'm dreading the first reviews – it's been a rough experience, and reviews can be quite brutal. I just hope the audience appreciates that I've tried to respect their time by keeping it minimal and pragmatic. Given that Kindle pays per page read, I'm not surprised there are some oversized books on the shelf!

Here's the book in all its glory; it's fun to see what it looks like in Japan's store.

Amazon US

Amazon UK

Amazon JP










GameDev Diary – SpaceLife #7

It was all going so well.

I placed some pre-canned planets procedurally, and made the planets farthest from the sun gaseous. Then I decided to be a fool and turn on lighting, producing some lovely results, as you can see below:


Yay! That looks cool.

But then I discovered something quite awkward.

The sun is in a mini, scaled down solar system 30,000 units away. So that’s where the directional lighting is.

The ship and asteroids are NOT in the same place. So we're actually getting the right shadows by luck rather than by judgement. If I warped behind the sun, we'd get the shadows on the wrong side, because it's all been faked. We'll also find that shadows of planets will only fall on other planets and not on the ship or asteroids, because the light is coming from a different direction and distance (we won't get the correct cone of shadow).

I’m not 100% sure what to do about this. I need to prevent light bleed from both locations, yet represent the light in both locations.

If I create a fake light in my 'local space' and curtail the light effects of the mini solar system so they do not bleed into us, then we still won't get shadows falling from planets. That will manifest itself when we start to add space stations. If a space station is near a planet, and the sun is behind the planet, then the space station's rear should be dark too. But because I'll be faking a local light that hasn't got a planet in the way, that won't be the case.

So I either work out something clever, or I keep this simple-stupid.

Time to read up on lighting I think!


Using a point light in the scaled-down universe allows me to get an area-of-effect light but curtail its distance. That's the solar system lit up nicely! You can see here that a planet further away is bright as day, but the back side of the planet near us is dark.

[Screenshot: the mini solar system lit by the point light]
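In Unity terms that's just the Light component's Range property; something like this sketch (the values here are placeholders, not the real ones):

```csharp
using UnityEngine;

public class DistantSunLight : MonoBehaviour
{
    void Start()
    {
        // A point light whose Range caps the area of effect, so the mini
        // solar system is lit without bleeding into the near universe.
        Light sun = gameObject.AddComponent<Light>();
        sun.type = LightType.Point;
        sun.range = 2000f;      // assumed: just big enough for the mini system
        sun.intensity = 2f;     // assumed
    }
}
```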

However, if I were to add some asteroids, or change the camera to see the cockpit, they wouldn't be lit up yet. Time to add another spotlight that lives in our near universe, mimicking the position of the sun.

This is achieved by a spotlight 5000m away in the +ve Z axis, pointing at us. Here’s a picture of how it all pans out.


Now the trick will be to keep the sun in the correct place – remember the floating origin trick? Well, if we do nothing and keep our local sun as a root game object, it will work, but you'll see the intensity of the light get higher and higher until the player reaches the threshold for the floating origin 'push back'. Suddenly the lighting conditions will appear to flip to a darker scene. So we have to keep the local sun at the same distance from us; and if its starting position is different (maybe in future we'll warp to a planet on the other side of the sun), it must keep the same elevation/position relative to the mini universe's specifications.
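A sketch of what that script might look like (the naming and the LateUpdate choice are illustrative, not the actual code):

```csharp
using UnityEngine;

public class LocalSun : MonoBehaviour
{
    public Transform player;                          // the player's ship
    public Vector3 directionToSun = Vector3.forward;  // +ve Z, as described above
    public float distance = 5000f;                    // 5000m, as described above

    void LateUpdate()
    {
        // Re-anchor every frame: same bearing and distance from the player,
        // so floating-origin push-backs never change the apparent intensity.
        transform.position = player.position + directionToSun.normalized * distance;
        transform.LookAt(player);                     // spotlight points back at us
    }
}
```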

What have we lost? Shadows from planets. But in reality, the player probably won't even notice.

After adding some ambient light, our scene looks nice, like the first images of ‘fake success’.

[Screenshots: the ambient-lit scene in the Unity editor]

Next, I think I need to tidy up the current code and start thinking about how to warp near planets to get to their space stations – if they have them. One day I'll get rid of the lame warp-up/down etc.; that's just there to play around with.


GameDev Diary – SpaceLife #6

Generate this!

It's easy to get lost when you're a solo developer. Now and again it's a good idea to come up for air, think about what your goal is (or even your critical path), and ask yourself questions like "Is this a game I would play?".

I'd like to say my goal was to make the best space game out there, but that's not realistic. While I was looking into procedural generation of solar systems and planets, I came across an interesting debate. There were voices declaring that the procedural generation of 18 billion stars in No Man's Sky didn't sound like fun – that the variation between systems would develop patterns and not be engaging enough. The fear is that computer-generated content can be boring by the very virtue that it will inevitably have constraints on the parameters involved; creativity and purpose will be limited. We humans are incredibly good at feeling repetition, even if we can't immediately see it. What's the point in travelling across a few thousand planets if you feel they are all roughly the same? Maybe that's the inevitability of the universe. It probably is like that, but even more boring. Let's not forget how excited scientists are to sample some dust or have the mere hint of water on Mars.

Players often want some kind of journey full of delight and surprises; others are quite happy being a nomad, living away from civilization and building their own game in their heads – but surely enough real content needs to exist for this to be more than a fleeting engagement.

Of course, procedural generation takes a huge load off a development team, which no longer has to create every minute scrap of content, so it's perfect for a solo developer (as long as it's kept simple). In my search for the balance between fun and procedural generation I came across this great blog post by Kate Compton. I hope you appreciate the link – it took me an hour to re-find it from yesterday!

What am I making (the generator)?

  • A star system generator, which can lay out space stations, jump gates, hidden treasures, asteroid belts, resources, planets, nebulas – you name it. OK, maybe that's too much right now; let's focus on just planets and nebulas.

What’s good/important about my star systems?

  • If the system is a solar system, it should contain a sun/star ('solar' gives it away?).
  • Planets should move around elliptical orbits centred on the biggest mass. Those orbits do not have to be on the same plane, although most probably will be.
  • There may be one or two nebulas in the system, with size constraints.
  • Some planets have moons, which should also rotate.
  • The background skybox.
  • The same solar system must exist and behave exactly as it did if we visit again (unless we add some catastrophic event).

What must not happen?

  • Planets must not be too close to each other.
  • Objects must be limited in size.
  • Planets shouldn't all be the same size or texture.
  • Moons should be smaller than the planets they circle, and must not collide.

OK, that's still a lot to think about – so the first mission is just to get the central star procedurally generated, plus the skybox.

Universe coordinates

To start procedural generation, we need a deterministic algorithm based on a fixed set of values: we always want the same output for a specific input. This saves us from having to store lots of detail, configure scenes, test – you name it.

I imagine my starting star system is a box at coordinate 0,0,0.

Imagine there are other boxes above (0,1,0), below (0,-1,0), left (-1,0,0), right (1,0,0), forward (0,0,1) and back (0,0,-1). This can be the key for all the detail we need in the universe. Of course, this doesn't mean we can only move in six directions; there's nothing stopping diagonal movement in the game. I may decide that some kind of routes are available through the universe, based on the availability of jump gates, and that we do not expose all of these boxes to the user. But I'll leave that alone for now – see how easy it is to get lost in your own thoughts?

If I wanted to make the player’s universe more sparse at the edges or center, I could use those x,y,z values to influence the cap on numbers of planets, or other features.

e.g. if x, y and z all fall in the range -5 to 5, we are in 'high security space', where bandits do not appear; as we leave that zone, the percentage chance of meeting them increases. If we have a huge universe, we might have a more complex algorithm that uses sine waves, or simplex noise, to produce interesting waves of security, which then inform what types of ships, alliances and predators are present.
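As a rough sketch of that idea (the 10%-per-box ramp is just a placeholder, not a decision I've actually made):

```csharp
using UnityEngine;

public static class SecurityZones
{
    // Core boxes are 'high security space'; bandit chance ramps up outside them.
    public static float BanditChance(int x, int y, int z)
    {
        int edge = Mathf.Max(Mathf.Abs(x), Mathf.Max(Mathf.Abs(y), Mathf.Abs(z)));
        if (edge <= 5) return 0f;                 // no bandits in the safe zone
        return Mathf.Clamp01((edge - 5) * 0.1f);  // assumed: +10% per box beyond it
    }
}
```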

But for now, I’ll just use those universe coordinates to populate some data that I’ll use as a starting point for my skybox and solar system.

Here's the code to generate a deterministic set of numbers from vector coordinates. It gives us 20 bytes (values 0..255). We can isolate individual bit patterns for simple on/off decisions, or use the modulus operator to select an item out of an array. If we wanted to, we could combine four bytes to make a large 32-bit integer.
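The original code screenshot hasn't survived here, so below is a minimal sketch of the idea: hash a string built from the box coordinates. I've assumed SHA-1, which conveniently produces exactly 20 bytes, via .NET's HashAlgorithm.ComputeHash() (mentioned below); the names are illustrative.

```csharp
using System.Security.Cryptography;
using System.Text;

public static class SystemSeed
{
    // Deterministic 20 bytes for a universe box: the same coordinates always
    // hash to the same bytes, so a revisited system regenerates identically.
    public static byte[] GetBytes(int x, int y, int z, string suffix = "")
    {
        string baseData = x + "," + y + "," + z + suffix;
        using (SHA1 sha1 = SHA1.Create())
        {
            return sha1.ComputeHash(Encoding.UTF8.GetBytes(baseData));
        }
    }
}
```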


This algorithm only gives us 20 bytes – but what if we need more? (We will.) Simply add some text to the base string and generate another hash, e.g.:

planetData = baseData + "Planet";

Then call ComputeHash() using planetData.
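A hypothetical usage, carrying on from the sketch above (planetTextures is made up for illustration and assumed to be populated):

```csharp
using UnityEngine;

public class PlanetSeedExample : MonoBehaviour
{
    public Material[] planetTextures;   // hypothetical set of planet materials

    void Start()
    {
        // Derive a second 20-byte block for planet decisions in box (0,0,0).
        byte[] planetData = SystemSeed.GetBytes(0, 0, 0, "Planet");

        bool hasRings = (planetData[1] & 0x01) == 1;          // isolate a bit for an on/off decision
        int texture = planetData[0] % planetTextures.Length;  // modulus to select from an array
        Debug.Log("hasRings=" + hasRings + ", texture=" + texture);
    }
}
```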

I think that's enough blogging for now – I'm going to get my head down, bash out the nebulas, orbits and planets, and see what problems arise. Till next time.













GameDev Diary – SpaceLife #5

Having failed miserably at getting close to planets, or at having huge planets without introducing flicker through Z-buffer issues, and seeing the effort involved in Kerbal Space Program (KSP), I think that for now my time is best spent elsewhere. My scaled-down 3D skybox can still yield fantastic results, and for now the distant camera will not move. I may bring back camera movement in the 3D skybox at a later date, but I don't want to define playable game areas, nor have ugly "turn around!" warnings.

From memory, the game EVE (when I played it) allowed the player to travel unconstrained around a system, but any planets or nebulas remained static. You could warp between jump gates, and to stations, and I don't think planets got closer; if they did, it wasn't massively so. You could also probe parts of the system and warp to them, still with the same scenery, and it didn't feel wrong. If I want the user to go to planets, I'll use space stations or specific warp sequences to achieve it, and maybe, like Star Wars Galaxies, have constrained game areas.

Regardless of whether I move or scale planets, I still need to follow the KSP floating origin mechanism and move all other game objects, including my skybox, keeping my player at 0,0,0 so that physics operates cleanly.

Well, that was pretty easy. Although, if I remember rightly, the floating origin scripts I've seen also do something funky with particle effects.
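For reference, a stripped-down sketch of the mechanism (the community scripts I mentioned also re-base particle systems, which I'm leaving out here; the threshold value is an assumption):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

public class FloatingOrigin : MonoBehaviour
{
    public Transform player;
    public float threshold = 1000f;   // assumed push-back distance

    void LateUpdate()
    {
        Vector3 offset = player.position;
        if (offset.magnitude < threshold) return;

        // Shift every root object (ship, asteroids, the 3D skybox rig) back by
        // the player's offset, putting the player at 0,0,0 for clean physics.
        foreach (GameObject root in SceneManager.GetActiveScene().GetRootGameObjects())
        {
            root.transform.position -= offset;
        }
    }
}
```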



I've had time to add one of the standard Unity flare assets to make the sun look sparkly too. But because the light is in the distant scene, it's not illuminating my ship – a problem for another day!


GameDev Diary – SpaceLife #5

Planet Hacking – scaling objects based on distance

Continuing my experiments in creating large planets, I came across this post, where people simulate large scales by resizing objects based on their distance from the viewer. I couldn't get the code in the post to work, because I'm still moving the camera and not the world – but I created my own implementation based on the theory, using the downscaled 3D skybox containing the distantCamera and distantSun.

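The screenshot of that implementation is lost, but it amounted to roughly this (a reconstruction; names are illustrative):

```csharp
using UnityEngine;

public class ScaleByDistance : MonoBehaviour
{
    public Transform viewer;            // the distantCamera in the mini skybox
    public float startDistance = 40f;   // distance at which scale is 1x
    private Vector3 baseScale;

    void Start() { baseScale = transform.localScale; }

    void LateUpdate()
    {
        float d = Vector3.Distance(viewer.position, transform.position);
        // Scale is proportional to startDistance / currentDistance:
        // 40/20 = 2x, 40/10 = 4x, 40/2 = 20x, 40/1 = 40x...
        transform.localScale = baseScale * (startDistance / Mathf.Max(d, 0.001f));
    }
}
```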

If you haven't noticed already, there's a glaring error with my maths:

  • As you get closer to the object, the scale isn't linear. At a starting distance of 40, if we are 20 units from the planet, its scale is multiplied by 40/20 = 2; but if we go only 10 units closer, it's 40/10 = 4. Are we really twice as close? If we are 2 metres away, the planet is 20 times bigger, and at one metre away it's 40 times bigger. That's not right!
  • Even if the maths were right, we are doubling up distance movements by scaling AND moving objects. The edge of the object has moved towards or away from us in addition to the movement itself.

If we want to move around planets, we have to be able to move the camera (or world) around, so I have to ask myself: why am I concerned with scaling objects by distance? What problem am I solving?

The problem is that within the mini skybox all planets have been downscaled, and when we move around, we need them scaled up at runtime. This has two advantages: one, we can arrange our solar system with ease (you can't easily move items that are huge, or even fit them into the scene); and two, we give the perception that moving 1/100000 of a unit feels like moving 1 unit. So, back to the drawing board on the maths – trying to take into account a linear progression, but also the scale of the camera movement…

I've worked out what was wrong with the maths, but there's another problem: even if I solve the scale issue, Unity just can't handle it. If I want really large planets, drawing really large objects isn't the way forward. With a scale of 1.6m, a distance of 800,400 and a camera far clipping plane of 1000, I can see the planet, but it flickers really badly. Obviously I'm not the first person to try to tackle this problem, and it looks solvable if you know what you're doing. I currently do not!

The most interesting posts I've found so far on this subject are here:

And this awesome video

Either this will inspire or destroy me!