From Codecity to the Metaverse: How will the Metaverse’s software manifest?

PS: As an engineer, although I take a wait-and-see attitude toward the future of the metaverse, that does not stop me from studying it.
To prove to someone that I bought the Oculus Quest 2 for the sake of better software development, not to play games – or, well, to play games – I spent my spare time this week building a new interactive ~~front-end interface~~ for Coco, the performance analysis tool the Inherd development team had built earlier: Codecity. Oh no, I mean Codecity is a system refactoring and analysis tool for the Metaverse.
After putting together this simple demo, I could prove to someone that my Oculus Quest 2 was money well spent – it produced at least one article and one piece of open-source software.
After writing the MVP version of Codecity, I thought again about the interactive form of software. The last time I thought about this was about six years ago, when I was playing with Leap Motion, Oculus Rift, Google Glass, and other such devices. The conclusion back then was that the time was not right. So, is now the right time?
Idea: Start with 3D Codecity
To try out a new field, it is natural to start from what you are good at, and Codecity is just such a scene for me. I first came across it through the book “Your Code as a Crime Scene”, which my boss recommended. CodeCity is an approach to software analysis, created by Richard Wettel as early as 2007.
According to his description, CodeCity is an integrated environment for software analysis, in which software systems are visualized as interactive, navigable 3D cities. Classes are represented as buildings in the city, while packages are depicted as the districts in which the buildings reside. The visible properties of the city artifacts depict a chosen set of software metrics, as in the polymetric views of CodeCrawler.
We implemented a 2D version of this visualization in Coco. Since it uses D3.js as the rendering engine, the result is flat, and its interactive capabilities are relatively limited. So, to embrace the metaverse concept in the spirit of hype-driven development, I built a 3D version of Codecity combined with WebXR (VR + AR) – a virtual code world with all sorts of uses. As shown below:
PS: Since my Oculus is still on the way, I cannot vouch for the effect under real XR for the time being; I am simulating with the WebXR emulator on Firefox. If you have the corresponding hardware, you are welcome to try it and submit a PR.
The good news is that WebGL performance is said to be close to that of native applications on the GPU; otherwise Codecity might not be so practical. Online demo: https://phodal.github.io/codecity/ .
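For reference, here is a minimal sketch of how a Three.js scene is typically made WebXR-capable – roughly what “combined with WebXR” means above, though not necessarily how Codecity wires it up internally:

```typescript
import * as THREE from "three";
import { VRButton } from "three/examples/jsm/webxr/VRButton.js";

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, innerWidth / innerHeight, 0.1, 1000);

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
renderer.xr.enabled = true; // opt in to the WebXR render path
document.body.appendChild(renderer.domElement);
document.body.appendChild(VRButton.createButton(renderer)); // "Enter VR" button

// XR sessions require setAnimationLoop instead of requestAnimationFrame
renderer.setAnimationLoop(() => renderer.render(scene, camera));
```

With the WebXR emulator mentioned above, the same page can be tested in a desktop browser before the headset ever arrives.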
At the code level, writing Codecity is very simple; any front-end developer with a little graphics experience can build these features (a minimal sketch follows the list):
- D3.js simplifies the graphics and numerical calculations, generating the layout through a treemap.
- Three.js builds the 3D world and adds interaction prototypes.
- GitHub Actions automates CI and CD.
- Tests: TBD.
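To make the division of labor concrete, here is a minimal sketch of the core idea – D3's treemap computes the city layout, and Three.js extrudes it into buildings. The `CodeNode` shape and the scaling constants are assumptions for illustration, not Codecity's actual schema:

```typescript
import * as d3 from "d3";
import * as THREE from "three";

// Hypothetical input shape; the real tool derives this from Coco's analysis.
interface CodeNode {
  name: string;
  loc?: number;          // lines of code drive building size
  children?: CodeNode[]; // packages contain files and sub-packages
}

function buildCity(scene: THREE.Scene, data: CodeNode): void {
  // Area of each block is proportional to its lines of code.
  const root = d3.hierarchy<CodeNode>(data).sum((d) => d.loc ?? 0);
  d3.treemap<CodeNode>().size([100, 100]).padding(1)(root);

  for (const leaf of root.leaves() as d3.HierarchyRectangularNode<CodeNode>[]) {
    const { x0, x1, y0, y1 } = leaf;
    const height = Math.max(1, (leaf.data.loc ?? 0) / 50); // crude scaling
    const building = new THREE.Mesh(
      new THREE.BoxGeometry(x1 - x0, height, y1 - y0),
      new THREE.MeshLambertMaterial({ color: 0x4488cc })
    );
    // The treemap yields corner coordinates; Three.js positions by center.
    building.position.set((x0 + x1) / 2, height / 2, (y0 + y1) / 2);
    scene.add(building);
  }
}
```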
For those with game development experience, such as users of Unity or Unreal Engine, building such an application is even simpler. The only issue is that those engines charge a fee. By the way, I think the payment model for this type of software should change in the future, for example toward ecosystem co-construction.
With this MVP (Minimum Viable Product), we are standing at the edge of the metaverse.
Digital twins: replicating objects from the physical world to the virtual world
Before we enter the metaverse, we need a rough idea of how to get from the physical world to the virtual one. A very interesting entry point is digital twins. Years ago, this set of concepts lived mostly in the IoT domain, where the main entry point was device twins – I have also translated a set of related articles.
Azure IoT offered a device twin service as early as 2017, which allows users to (a rough sketch of such a twin document follows the list):
- Store device-specific information in the cloud.
- Report current state information from the device app.
- Synchronize the state of long-running workflows between device apps and back-end applications.
- Query the device's information, configuration, or state.
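To make this concrete, here is a rough sketch of what such a twin document looks like, following the general shape of Azure IoT Hub's device twins (desired vs. reported properties); the thermostat fields are made up for illustration:

```typescript
// The twin lives in the cloud; the device and the back end each own one half.
interface DeviceTwin {
  deviceId: string;
  tags: Record<string, unknown>; // back-end metadata, invisible to the device
  properties: {
    desired: Record<string, unknown>;  // written by the back end, read by the device
    reported: Record<string, unknown>; // written by the device, read by the back end
  };
}

const thermostat: DeviceTwin = {
  deviceId: "living-room-thermostat", // hypothetical device
  tags: { building: "home", floor: 1 },
  properties: {
    desired: { targetTemperature: 21 },                 // the back end requests a state...
    reported: { temperature: 19.5, firmware: "1.2.0" }, // ...the device reports reality
  },
};
```

A long-running workflow such as a firmware update then amounts to the back end writing to `desired` and waiting for `reported` to converge.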
Eclipse has also built the Eclipse Ditto framework, which implements the software pattern of “digital twins”.
To put it simply, in order to bring the devices around us into the virtual world, we need “virtual-world-oriented” modeling. Whether that means building a 3D model of the physical world or a networked design for objects, it can all be done with existing, mature software.
By this standard, Codecity just barely qualifies as a code analysis tool for the metaverse: it maps the “physical” properties of code – number of lines, frequency of changes, hierarchy, and so on – into the virtual world. What still needs improvement is the interaction, and the reverse mapping from the virtual world back to the code's “physical world”.
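As a sketch of what that mapping can look like in practice – again an illustration rather than Codecity's actual schema – lines of code can drive building height while change frequency drives color, so that frequently modified “hot” files stand out:

```typescript
import * as THREE from "three";
import { scaleLinear } from "d3-scale";

interface FileMetrics {
  loc: number;     // lines of code
  changes: number; // number of commits touching this file
}

// Hot files shade toward red, stable files toward blue.
const heat = scaleLinear<string>()
  .domain([0, 100])
  .range(["#3366cc", "#cc3333"])
  .clamp(true);

function buildingFor(metrics: FileMetrics): THREE.Mesh {
  const height = Math.max(1, metrics.loc / 50);
  return new THREE.Mesh(
    new THREE.BoxGeometry(4, height, 4),
    new THREE.MeshLambertMaterial({ color: new THREE.Color(heat(metrics.changes)) })
  );
}
```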
Virtual worlds: from flat software to object modeling
In a sense, the smart home devices we use already have some characteristics of digital twins. What is still missing is modeling the real-world properties of these appliances so that they can be mapped into the virtual world.
After that, we also need to model the house itself so that it has a counterpart in the virtual world. This is not an easy task: we own a range of household items with no electronic properties at all, so it would rely on modeling them in real time. Such real-time modeling brings a series of privacy issues – unless, of course, there are already cameras in your home. On top of that, it has to cope with the challenges posed by your cat.
Seen from the perspective of industries such as games and film, this field is already quite mature. What remains to be studied is how to quickly model the “things” of the physical world using physical equipment.
Metaverse: Virtualization and Immersion
Wikipedia’s explanation for the metaverse is:
The Metaverse is described as a futuristic, decentralized online 3D virtual environment. This artificial virtual world can be accessed through virtual reality headsets, augmented reality glasses, mobile phones, personal computers, and electronic game consoles.
I have never had any confidence in decentralization – not even 0.000001%. To take a very simple example: in the world of virtual currencies, computing resources determine everything, which means that whoever is stronger becomes the center. On the other hand, every company of a certain size will launch its own metaverse, so there will be a pile of universes; among these metaverses, computing resources still determine everything. And once the metaverse becomes a standard, a kind of infrastructure, it is centralized once again.
From the content above, we already have a basic understanding of virtualization. Compared with virtualization alone, I believe the metaverse will bring a better interaction experience. As an early VR user who suffered from VR motion sickness, I have found AR to be the better experience. This is probably why the Oculus added hand tracking, so that bare hands can be used to ~~play games~~ experience reality. The interaction offered by mouse and keyboard is not friendly; in games, controllers, fitness rings, and the like provide a better experience. So, compared with many years ago, I see more and more “success stories” proving the possibilities of immersive collaboration.
PS: By the way, as an early Leap Motion user, I prefer its controller-free, gesture-based interaction to the handheld controllers of the various VR devices. However, it demands faster processing and more computing power on the device.
The pivot of the Metaverse: Immersive Collaboration
For humans, as social animals, collaboration is the foundation of our productivity.
COVID-19 has significantly accelerated the digitalization of enterprises over the last two years. In the software development industry, remote work has become more and more common, along with a large number of online meetings and online collaborations. Multi-location, multi-center collaboration is also a challenge for large enterprises: although people can fly almost anywhere within a day, the time cost of such travel is high. Plenty of practice now shows that these trips can be moved online; the only downside is that collaborating remotely remains quite difficult.
So, from a personal perspective, if the metaverse is going to bring value, then “immersive collaboration” might be a good place to start.
Back to Reality: Building the Technological Reserve That Generates Ideas
The above has roughly described a series of technologies the metaverse needs. Back in reality, as individuals we can consider building up a small reserve of basic technologies. From my own experiments, some potential areas might be:
- Game programming.
- Digital twins, including the related content required by the Internet of Things.
- Interaction design.
- Collaboration models.
- …
So, from a developer's perspective, we don't necessarily have to chase this craze. But you can consider building yourself a technical reserve through something lightweight like a hackday. In my own case:
- Seven years ago, at a hackday event, I took part in building the team's house-viewing robot.
- Five years ago, at another hackday event, we built version 3.0 of the little robot, adding VR support.
- Three years ago, on a pre-sales project, we built a demo for viewing cars online in VR.
I'm a developer who likes to consolidate what I learn, so output in the form of code + articles is the proof of these reserves – such as the series of articles listed at the end of this post.
And from the perspective of a medium-to-large organization, we can try to:
- Build an experimental team. We can see this in the experimental products of companies like Apple, Amazon, and Google. I have also seen negative cases: some domestic labs may rely too heavily on integration with existing business.
- Keep exploring in combination with the business, accumulating small successes.
- Adopt open infrastructure.
- Learn from success stories in other fields.
Of course, the cost of simply acquiring a team would obviously be lower.
What will the software form of the future look like?
Finally, back to the title.
As mentioned above, extending interaction is a long-standing problem of software form. In a sense, interaction devices determine the shape of software, while AI keeps trying to reduce the amount of interaction needed. The game between the two will become very interesting. Fortunately, this change only goes so far, because people still have to eat.
In addition to the above, I think there will be:
- Cloud is everything.
- Cloud rendering. We have more and more powerful GPUs on the mobile side, but they may still not be enough for the challenges posed by the metaverse. Relying on faster network speeds such as 5G, the cloud can take over this rendering, and the client only needs to act as a display.
- Open infrastructure. For small and medium-sized enterprises, this means using industry-standard, open-source tools and software. For large enterprises, it means providing an open environment and ecosystem, and open-sourcing the underlying infrastructure. Such open standards are needed in this open world more than at most moments in history.
- Maybe even 6G. Although I have not used 5G yet, I believe 6G will provide better transmission formats and algorithms for the metaverse and will process all kinds of metadata efficiently and intelligently. 5G aims at faster network speeds and other qualities that can address a series of challenges posed by the early metaverse. Personally, though, because the network introduces so many uncertainties, base stations may become the core nodes of the metaverse.
- At the same time, building the metaverse will require more jobs and consume more electricity. So, compared with the items above, I think investing in the metaverse's periphery – such as programmer training and green energy – may come with stronger industry barriers.
What do you think it will be like?
Summary
In high school, my two favorite fields were the Linux kernel and game programming. Comparing the two, I was far more interested in game engines than in the Linux kernel. Developing a game is probably the childhood dream of most boys. And when being online becomes a virtual world, whether it is a game or not no longer matters much.
Of course, this article has carried a touch of self-mockery from the very start.
Codecity is available here.