Thursday, 30 May 2019

Mini Melbourne Step 2b - Getting the World: Final Workflow

In the last post we talked about the early attempts and the shortfalls of each. I am not sure I am doing justice to the learning that went into getting to this final workflow, the one that worked for us, but to put it in some 'time' perspective: it was multiple weeks of data, attempts, research, more attempts, more data, more attempts... well, you get the idea.

But, all that aside, here is the final workflow that worked for us and our data. As mentioned in the previous post, all the data was provided to me in .obj files exported from 3dsMax, in a process that I did not have, or want, direct access to.

There is this really neat, really really neat in fact, program called Qubicle, which can import almost any 3D file and 'voxelise' it. Even better, if you pay for it, it has an export-to-schematic function built in. This piece of software, shown to me by Adam, is a key part of the path. However, there are limitations along the way, the biggest being that you cannot take all the data and just mash it in at once. Scale is very important for this particular project, and sectioning the data was a big part of what makes this world infinitely expandable. You see, it appears that Qubicle has a 'hard limit' on how many 'blocks' it can effectively scale to. For me it was around 256 on the vertical axis; once you started to scale models beyond that size, things started going awry. The models would be 'pulled apart' in odd ways, and the 'verticality' of the data would get all kinds of messed up, like this:


So, you can see that is a bit of an issue if you are trying to recreate a whole heap of a city. There is no way you could do it using that as a basis. So, after a whole heap of messing about, and working with those that held the data in 3dsMax, we settled on a size we would work from: 200 meter square sections were the most reliable.

Another interesting thing I found along the way about Qubicle is that when you import a model, it tries to automatically scale it as big as it can. The 200m model size seemed to stop Qubicle messing with the scale on import, which was a huge win, and in hindsight that is likely because 200 cannot be doubled and still stay under 256. This caused us some issues with the tallest building in Melbourne, where the scale of those sections came out slightly off, so I had to learn how to edit that particular section in Qubicle, export it in two vertical parts, and 'tweak' the scale so it aligned properly with the 1 block = 1 meter of the rest of the map.
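Just to put that limit into concrete numbers, here is a little sketch (a hypothetical helper, not something that was part of the actual workflow) of the kind of check you could run on an .obj export before feeding it to Qubicle, assuming the export uses 1 unit = 1 meter with Y as the vertical axis:

```python
# Hypothetical helper, not part of the actual workflow: read an .obj export,
# measure its vertical extent, and work out how many vertical sections are
# needed to stay under Qubicle's ~256 block limit at 1 block = 1 meter.
import math

QUBICLE_VERTICAL_LIMIT = 256  # approximate limit I ran into, in blocks

def vertical_extent(obj_path, up_axis=1):
    """Height of the model in meters, assuming 1 OBJ unit = 1 meter and that
    up_axis (0=x, 1=y, 2=z) is the vertical axis of the export."""
    lo, hi = math.inf, -math.inf
    with open(obj_path) as f:
        for line in f:
            if line.startswith("v "):  # vertex line: "v x y z"
                value = float(line.split()[1 + up_axis])
                lo, hi = min(lo, value), max(hi, value)
    return hi - lo

def sections_needed(height_m, limit=QUBICLE_VERTICAL_LIMIT):
    """How many vertical slices a model needs so each one stays under the limit."""
    return max(1, math.ceil(height_m / limit))

# Made-up example: a ~300 m tower would need two vertical parts.
print(sections_needed(300))  # -> 2
print(sections_needed(180))  # -> 1
```

The numbers in the example are made up, but that is the kind of arithmetic behind splitting that one tall section into two parts.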


Originally we were talking about 100m models, as shown in the image above, but I am glad we were able to push it out to 200m, because each 200m square covers the same area as four 100m squares, which cut the work required to stitch it all together down by three quarters. For completeness' sake, I should also tell you that I tried to stitch these 200m square models together in Qubicle itself, but it was just too much for the software to cope with, and it started causing me issues with accurately aligning the models.

The time taken to render and move models around as more and more of them were added was very problematic, and I am not working on a potato of a computer. On top of that, after persisting with it and getting everything to a point I thought was close, it turned out Qubicle was unable to export such a large model to .schematic: it kept crashing when I tried, and the sections I did manage to export were actually not aligned very well at all, despite appearing OK in Qubicle... 3D modelling is all about perspective, and making sure you are checking things from all possible perspectives.

So, enter the final piece of software along the way: MCEdit. To be honest, I have never been a huge fan of MCEdit in the past, not because I don't like the program or what it is capable of, but because I was unfamiliar with it, and while it looks like Minecraft, there is a massive difference in the way the controls work and the way the 'player' (the camera) moves, and I really struggled with that.

This is why, back in the MinecraftEdu days, I loved WorldEdit so much: it was in game, powerful, and the controls made sense! But boy do I love MCEdit now. I learnt so much about MCEdit 'as required' during this project, which was actually an awesome way to learn, and while I am still nowhere near an expert, not like Adrian, nope, nothing like that, I am now proficient enough to do most of what I need to get done in a timely and, I believe, efficient manner. Or efficient enough for me, anyway.

Stitching the pieces together took a lot of practice, and involved aligning each piece very carefully. There were multiple restarts and throwaway worlds along the way, but once I figured out the proper way to nudge parts around when importing, it was actually quite easy to do. It was also way easier to align the different pieces properly in MCEdit than it was in Qubicle, because Minecraft is very explicit about its coordinate system, which means there are significantly fewer 'misalignment' opportunities.


I did realise early on that I couldn't always align the edges of the models in a fixed grid, because, despite my requests to have everything in 200m square slabs, you can see from the map image provided above that if you combine a 2x2 grid of those squares, not all of them end up square, nor exactly 200m. So I had to devise a plan for finding the corners of each 'square' as it came in, so that I could more easily align the pieces, 28 in all. What I ended up doing was 'marking' each corner of a piece with a different block after I imported it, so that I could more easily align the adjacent pieces.
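For what it is worth, the nominal maths behind where each piece should go is trivial, even though the real slabs did not always match it exactly, which is the whole reason for the corner markers. A purely illustrative sketch, assuming 1 block = 1 meter and using Minecraft's horizontal X/Z axes:

```python
# Purely illustrative: where each nominal 200m tile's corner *should* land,
# given a chosen world origin and 1 block = 1 meter. In practice some slabs
# were not exactly 200m square, which is why I marked the real corners with
# a distinctive block rather than trusting the grid blindly.
TILE_SIZE = 200  # meters, and therefore blocks

def tile_corner(col, row, origin_x=0, origin_z=0):
    """Expected Minecraft (x, z) of the corner of the tile at grid cell (col, row)."""
    return origin_x + col * TILE_SIZE, origin_z + row * TILE_SIZE

# Example layout only: a 7 x 4 grid would give the 28 pieces mentioned above.
for row in range(4):
    for col in range(7):
        x, z = tile_corner(col, row)
        print(f"tile ({col},{row}) -> corner at x={x}, z={z}")
```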


See that very faint blue spot in the middle of the gray sea of stone? Yep, that is my marker! Now, as you can clearly see, the big downfall of all of this work is that, while all of the data was now in Minecraft, it was all made of stone and, well, didn't look much like Melbourne at all. We did explore texturing the models in Qubicle, as the .obj exports I received had material files associated with them, but we decided that 'translating' those materials across into Minecraft blocks just was not worth the time it would take to do properly, when we could texture the buildings in MCEdit for the 'broad strokes' and in-game for the finer details.

So that wraps up the 'getting the data into Minecraft' aspect of this series. Next up will either be how I started detailing the world, or heading back to the next stages of development of the Dig Experience; I am not really sure which I will feel more like writing about, since they were concurrent processes at this early stage. But stay tuned, and as always, thanks for reading, and feel free to drop a comment below.

Wednesday, 22 May 2019

Mini Melbourne Step 2a - Getting the World: Failed Attempts

Alright, so in the last post we talked about how we started looking at this project from the perspective of the 'Dig Experience' and that the building of the city was a secondary objective. These two 'projects' of course were in 'co-development' as the dig would be 'set' within the Mini Melbourne world. However, once we had nailed the basics of the dig, and determined that it was going to be possible, we looked at what portion of the city we were going to get into Minecraft. We knew we had 3D data of the entire city, but choosing a smaller section and testing out the process for getting that data in was important.

This map shows the initial planned data for import, and a small subset of that was given to me to do all sorts of testing and investigations on. The slab I was testing appears to be from St Paul's Cathedral along Flinders Street (north side) towards the east.


We were not in uncharted territory, and many people have their own unique workflows to bring data, big data, into Minecraft. So I turned to the community, and two of my favourite online Minecraft experts, Adam Clarke and Adrian Brightmore, showed me how they get 3D data into Minecraft, in multiple ways. Without those two and their expertise, the project would have been a lot more difficult. So I thanked them back then, and I am going to do it again right now. Now that the project has been released into the wild, I am so pleased with how it turned out, and that we actually managed to get it over the line.

Without the willingness of both of these amazing guys, I am not sure it would be as big as it currently is, or as easily expandable as I know it to be. With all the sincerity I can muster, thanks guys. This project started with freely shared community knowledge, and ends with a freely available resource for the entire world to use and explore, create and share across all major Minecraft platforms.

So, after my discussions with Adam and Adrian, I went back to the people that held the data, and we talked about what format they had it in and what format they could easily get it into. It took us quite a bit of time to nail down, but in the end we now have a workflow that gets their data into my hands in a form I can easily bring into the Minecraft world, and that is easily expandable for when we want to include more of the city in future updates.

So, how did we do it? In this post, as the title implies, I will share the explorations I started with and the issues each workflow caused, and then in the next post we will talk about the workflow that actually worked. All of this data was given to me in .obj format, exported from 3dsMax, a process which is a 'black box' that I know very little about!

First attempt: TinkerCad. Freely available, and something I have used before, so of course I started here. Interesting how you always start from an area of most comfort! Unfortunately for my comfort levels, I very quickly found that TinkerCad was not all that great at dealing with 'large' or 'detailed' models. It is fantastic for the smaller stuff I have done in the past, but not a great prospect when it came to neatening up all the holes and everything else in the large, detailed city models I had access to.


As you can see, there are holes in the terrain, entire missing sections of roof and all kinds of things I would have had to neaten up in 'post-processing', and given the scale of the data I was using, it was probably not going to be a great option moving forward, both in terms of scaling the project in future and ease of use. I also decided at this point that we were going to have to 're-north' Melbourne before we brought it into Minecraft. As you can see, all the buildings are 'angled', and it just looks 'jagged' for lack of a better word.


This is the same Tinkercad import, only 're-northed' by around 21 degrees. It looks way, way nicer, but you can see that the missing parts are still very prevalent.
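How the re-northing was actually done sits upstream of me in that 3dsMax 'black box', but mathematically it is nothing more than rotating every vertex around the vertical axis. A minimal sketch of the idea, assuming Y-up .obj coordinates and the roughly 21 degree angle mentioned above:

```python
# Conceptual only: 're-northing' is just rotating every vertex about the
# vertical axis (assumed Y-up here). The real rotation happened upstream in
# 3dsMax, not with this script.
import math

def re_north(vertices, degrees=21.0):
    """Rotate a list of (x, y, z) vertices by `degrees` around the Y axis."""
    a = math.radians(degrees)
    cos_a, sin_a = math.cos(a), math.sin(a)
    return [(x * cos_a + z * sin_a, y, -x * sin_a + z * cos_a)
            for x, y, z in vertices]

# Example: a point 100m east of the origin swings around by about 21 degrees.
print(re_north([(100.0, 0.0, 0.0)]))
```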


Next attempt: VoxtoSchematic. Adrian has a wealth of knowledge about MCEdit and a massive collection of filters for doing all sorts of crazy things with it. He pointed me in this direction: http://www.brightmoore.net/mcedit-filters-1/voxtoschematic, so I began exploring. It required me to import the .obj data into a voxel program; I used Magicavoxel (https://ephtracy.github.io/) to get started, because... free... and we were only exploring at this stage. This allowed me to 'paint' portions of the data a particular colour and then export it for VoxtoSchematic to work with, which would convert those colours to specific Minecraft blocks.

A truly brilliant piece of software by Adrian, and Magicavoxel is pretty darn cool as well, but the scale quickly became problematic, as I had no way to control what 'size' the model was converted to schematic at. EDIT: I found my notes (finally), and Magicavoxel also had a serious limitation on the number of blocks it could deal with: 126 blocks cubed. This was the first point where we seriously looked at how much data is too much. We really needed to find a balance between software limitations and time limitations. As in, we could easily export each building individually, but stitching that much data together at the other end would be way too time intensive to be sustainable. The great thing about this process is that the models imported quite well, neatly, with far less of the fidelity loss we saw in TinkerCad. However, the scale was all over the place, and I couldn't easily find a way to manage that effectively.
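To give a feel for what that colour-to-block step is doing, here is a sketch of the general idea only; it is not Adrian's actual filter, and the palette and block choices are made up for illustration. You paint a region a colour in the voxel editor, and each painted colour gets mapped to the nearest entry in a colour-to-block table:

```python
# Sketch of the colour -> block idea only; the palette and block choices are
# made up for illustration, and VoxtoSchematic's real mapping may differ.
PALETTE = {
    (125, 125, 125): "stone",
    (134, 96, 67): "dirt",
    (96, 96, 96): "cobblestone",
    (143, 119, 72): "sandstone",
}

def nearest_block(rgb):
    """Return the block whose palette colour is closest (by RGB distance) to rgb."""
    def dist(colour):
        return sum((a - b) ** 2 for a, b in zip(colour, rgb))
    return PALETTE[min(PALETTE, key=dist)]

print(nearest_block((120, 120, 120)))  # -> "stone"
```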


It was about this point that I understood enough about MCEdit to realise that no matter which path I took, it was going to play a big part in the creation of the final world, so we started to look at converting the 3D data directly to schematic format. I investigated FME, and some neat posts detailing how it could translate big data into Minecraft directly, but the path was just not clear enough, and neither I nor the data company had the software, or the time to learn it to the depth it appeared we would need, to complete that workflow, so we continued looking.

That, very quickly, summarises a couple of weeks' worth of testing, trialling, researching, testing some more, researching some more and testing even more. In the next post I will share the workflow that worked for us and our data. As always, thanks for reading, and if you have any comments, please feel free to leave them in the comments section below.

Wednesday, 15 May 2019

Mini Melbourne Step 1 - Is it even possible?

This is the first in a series, of unknown length, of posts detailing, I hope, the process I went through, the people that helped, the software we tried, what failed and why, what software we used in the end, and the game mechanics and the iterations of them that all came together to create both Mini Melbourne and Archaeology Adventure. If you haven't seen Mini Melbourne or the Archaeology Adventure... I would ask what rock you have been living under, but that could be rude, so I suggest you check out this promo video showing what it is. (EDIT: Apparently the embed has stopped working, so here is the link: https://fuse.education.vic.gov.au/Resource/ByPin?Pin=8WB7CM&SearchScope=All)

Funnily enough, even though the title of this post says Mini Melbourne Step 1, it didn't really start as a project to recreate such a large portion of a city in Minecraft. It started with a question, can we, two completely separate government departments, create a piece of content for Minecraft and make it really, solidly educational and suitable not only to both departments, but also to a wide range of schools, students and subject areas?

Out of the three ideas that were floated in preliminary emails, the one that held the most interest for the team, but was oddly enough my 'least favourite', was students performing an archaeological dig. If I am being honest, and I like to think I normally am, at the time I thought they were trying to make their ideas fit Minecraft, rather than making Minecraft fit their ideas, but as always, I went into the meeting with an open mind and, well.....

I was totally wrong, of course... I had no idea prior to our initial face-to-face meeting that there were actual archaeological digs happening a block away from me right then, as we were in discussions about this project, and that those digs would be the factual basis of this whole experience. That not only makes this a highly immersive experience, but an authentic one, based on real events at real sites, and on real people and communities from our city's history.

So, after our initial meeting, it was decided that the first project we would work on would be what became Archaeology Adventure, or the 'Dig Experience' as it was called internally for much of its development. I took the idea and started playing around with the mechanics I knew of that might support this kind of thing. I also had to explore some mechanics and make them up as I went, which has been one amazing side effect of working on this project: I have learnt so much, and I hope to share as much of it as I can by starting at the start and taking you on a journey to the finished product, both through this series of blog posts and through a YouTube video series that goes more in depth on each mechanic than I could possibly do here while keeping your attention!

Anyway, back on topic! I created a quick and dirty mockup of how I expected a dig might work to show the team. That constituted the following basic mechanics (there is a small conceptual sketch of a couple of these after the list):

A security pass, and check in system to actually get onto the site.



The raw and rough mechanics for randomly 'finding' items in a given area.


A way for students to choose how fast they would dig.


A way to 'uncover' what the found artefact was (analyse it) rather than just 'know' it straight away.


And, finally, a way to provide students with the information about each artefact that could support them in their goal of determining what the site was used for in the past.


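Before moving on, here is the small conceptual sketch promised above, covering the 'random find' and 'analyse' ideas. To be clear, the real mechanics are built out of Minecraft commands and command blocks, not code, and the artefact names and weights below are invented purely for illustration:

```python
# Conceptual only: the in-game version is built from Minecraft commands and
# command blocks, not Python. Artefact names and weights are invented here.
import random

# Each dig action has a chance of turning up nothing, or an unidentified find,
# weighted towards the more common artefacts.
ARTEFACTS = [("nothing", 60), ("clay pipe", 15), ("bottle", 15), ("coin", 10)]

def dig():
    """One dig action: returns an (as yet unidentified) find, or nothing."""
    items, weights = zip(*ARTEFACTS)
    return random.choices(items, weights=weights, k=1)[0]

def analyse(find):
    """Only after 'analysis' does the player learn what the find actually is."""
    return f"Analysis result: {find}" if find != "nothing" else "Nothing found here."

print(analyse(dig()))
```

The in-game version obviously looks nothing like this, but the logic of 'dig, maybe find something unidentified, then analyse it to learn what it is' is the same.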
Of course much of this, 'mechanically' speaking, has changed considerably over the development period, but the fundamental basis of it all is still there in some way, shape or form. I think the only things that have remained pretty much 'mechanically' the same are the security pass and check-in, and the analysis of items. But this 'pre-alpha' demonstration set the scene (or at least the base of the possible scene) for what we could achieve in Minecraft, for a room full of people who had never played before and really had no idea about what was possible.

With the 'raw' mechanics at least visible and planned out on a small scale, it was time to look at getting the data for Mini Melbourne itself, and that, my friends, is the topic of the Mini Melbourne Step 2 post. While you are waiting, I am sure with eager anticipation, for that 'mouse scroller' (is that the internet equivalent of a page turner?) to arrive, keep an eye on my YouTube channel for the video outlining each of these first mechanics, the reasoning and commands that make them work, and maybe even why I thought they were a good idea at the time.

As always, thanks for reading, and if you have any comments or questions, please feel free to leave them in the comments below, touch base on Twitter @EduElfie or even join our Discord: https://discord.gg/7fSQBdx.