
04 March, 2023

Hidden in plain sight: The mundanity of the Metaverse


Don’t you hate it when words get stolen? Now we won’t ever have a “web 3”; that version number has been irredeemably co-opted by scammers or, worse, tech-bros who live in a delusion of changing the world with their code, blindly following their ideology without ever trying to connect to the humanity their code is meant to serve.

Well, this is what happened to “the metaverse”. It didn’t help that it never had a solid definition to begin with (I tried to craft one here), and then the hype train came and EVERYTHING needed to be marketed as either a metaverse or for the metaverse.


The straw that broke this camel's back...

The final nail in the word’s coffin came when, notoriously, a big social networking company, looking at data showing its userbase and monetization trending down, decided it was time for a BOLD move: it stole the word and rushed all-in, making huge investments in all sorts of random things that looked metaverse-y, throwing the innovator’s dilemma and its solution in the trash.

But what if I told you that, hidden in plain sight, this idea of the metaverse is actually rather obvious, even mundane, and that all you need to do is sit down and observe what has been going on… with people?

Trends in the gaming industry.

I’m not the best person to wade through the philosophy and psychology of entertainment - how it is fundamentally social, interactive, and important.

And neither am I, even in my field, a historian - so I won’t be presenting an accurate accounting of what happened in the past couple of decades.

I hope the following will be mundane enough that it can be shown even through an imperfect lens, and for familiarity’s sake, I’ll use my own career as one.

I have to warn you: this is going to be boring. All that I’m going to say is obvious… it’s just that, for some reason, I don’t often see all the dots being connected…

Let’s go.

I started working in the videogame industry in the early 2000s. The very tail end of the PS2 era (I never touched that console’s code - the closest I came was modifying some OG Xbox stuff we were using, as we repurposed a rack of old consoles to help with certain data bakes), right at the beginning of the 360 one.

My first game (uncredited)

What were we doing? Boxed titles. Local, self-contained experiences. Yes, you could play split screen if you happened to have a friend nearby - and that’s incredibly fun, we are social animals after all… 

But all in all, you shipped a title, you pressed discs, people bought discs, inserted them in their console, played on the couch, rinse and repeat.

I did a couple of these, then moved from Italy to Canada to work for EA, a much bigger company. We’re around the middle of the 360/PS3 era now.

What were we doing? Yeah, you guessed it, multiplayer titles. Single-player was still important, local multiplayer was still important, and we were still pressing discs… but we started to move towards a more connected idea of gaming. 

You know I'm still proud of the work on this one...

We would do DLCs and support the game longer post-shipping; communities started to grow bigger as you could connect around a game.

The game you got on disc was not that relevant anymore; it was just a starting point, necessarily. There is no way to game-design something that will be played, concurrently, by millions of players. They will break your game, find balancing issues, and so on - so really, the game code was made to be infinitely tweakable, in “real time”, by people monitoring the community and making sure it kept being fun and challenging…

Gaming has always been a community, with forums, magazines, TV shows, and such, but you start seeing all of that grow: people staying with a game longer, sequels becoming more important, franchises over single titles…

What’s next? 

For me, PS4/Xbox One, Activision, Call of Duty… Where are we going? E-sports, Twitch, YouTube. A longer and longer tail of content. 

 I do miss the live-action, star-studded fun trailers COD used to make...

We go beyond tweaking the game post-launch; now the success of a game is really measured by how well you keep providing interesting content, and interesting experiences, within that framework you created.

Games as a service: we see the drop in physical game sales, the move to digital distribution - and with it, the boom of indie game making, of the idea that anyone can create and share.

Even big franchises, with their tight control over their IP, are nothing without the community of creators around them. Playstation “share” et al.

Call of Duty is not simply the game that ships in a box; it’s a culture, it’s a scene - a persistent entity even way before it was a persistent gaming universe (only recently happening with WarZone).

And then of course I moved to Roblox, where I am now - and I guess I should have said this somewhere already: this is all personal, my view of the industry, not connected with my job there or the company’s goals (Dave started from an educational tool and from there crafted a vision that has always been quite unique - arguably the reason why it now ended up being ahead, clearer, etc...). 

I like the positivity of NoisyButters

Hopefully, you can see that my point here is more general than what this or that company wants to do...

But again, I moved to Roblox personally because I liked the idea of being closer to the creative side of the equation. But in general, where are we now? 

What’s the new wave of gaming? Fortnite? Minecraft? Among us? Tarkov? Diablo 4? Whatever, you see the trends:

  • Games are social and encourage socialization; they are communities. Effectively, they are social networks, just like Clubhouse, Instagram, TikTok…
  • There are user-created universes “around” the games, even when the game does not allow UGC at all.
  • Games live or die based on the supply of content "flowing through" them. They are vehicles for content delivery.
  • The in-game world and real world have continuous crossovers, brands, concerts, events, celebrations…
Why do I play D3? For the transmog fashion of course!
And if you haven't played Fortnite / experienced its immense catalog of skins, you're missing out.

Conclusions.

Yes, all of this has been true in some ways since forever, in a more underground fashion. 

MUDs and modding, Ultima Online and Warcraft, ARGs, and LARPing, I know - nothing's new under the sun. But this does not invalidate the idea, it reinforces it, everything that is mainstream today has been underground before...

So, are we surprised that “the metaverse” matters? The idea of crafting the creative space, making a platform for creativity, having the social aspect built-in, going beyond owning single IPs? To make the YouTube of gaming, to merge creation, distribution, and communication? To allow people to create, instead of trying to cope with content demands by having everything in-house, in a continuous death march that will anyway never match what communities can imagine?

I have to admit, a lot of ideas I see in this space look incredibly dumb. The equation of the metaverse with AR/VR/XR, with the holodeck or Ready Player One, whatever… and look, one day it might even be that, in a time horizon that I really don’t care to talk about.

:/

But today? Today is mundane, it’s an obvious space that does not need to be created, it’s already here, in products and trends, and will only evolve towards more integrated platforms and better products and so on - but it is anything but surprising. 

It’s not science fiction, it’s basic humanity wanting to connect and create.

19 April, 2020

The Technical Interview [LEAKED]

Another small document made for the rendering team at Roblox (previous leak here).

I believe in sharing knowledge, and I don't see many videogame/realtime rendering teams talking publicly about their interviewing practices. I hope that it can be of some help.

Download here.

P.s. these notes are about principles, and they can be implemented in many different ways - they won't really give much insight about the structure of the interviews we do. For that, you should come and interview with us :)

29 August, 2019

Engineering Career Guide [LEAKED]

Q: How do I progress in my career as a (rendering) engineer?
A: Start from here. <== DOWNLOAD LINK

I hope this helps. It's not comprehensive, and I did remove some bits that are specific to us (they shouldn't matter anyway); it's meant to be a starting point for discussions.

I could have written a separate blog post but in the end it would have been a rehash of the same ideas so I just decided to spill some beans...


Not the download link! Just a Roblox monster.

18 July, 2019

What makes "10x" engineers. A complete hypothesis.

This happened on Twitter recently:


It's a long thread and I censored the author's name because it doesn't matter and I don't want to add to the hate that already naturally happens on any social media these days. 
To be honest, I'm still not entirely sure it's not just a parody, but I think it isn't. Regardless, it seemed a good excuse for a blog post.

Let's for a moment forget about how good or bad the term is. I don't particularly love discussions around words. I've never used "10x" in my career and I don't find it particularly great (also because it's linked to a certain start-up culture that I don't particularly enjoy), but I don't really want to open that can of worms.

What's wrong with a description of a "10x" engineer like the one above? 

I hope it's obvious but I'll spell it out. It tries to describe an engineer not in terms of the work they do, the actual output and results, but in some kind of mystical terms.
Like you can infer people's skills by looking at how they dress, what they eat, what time they come to the office and so on.

It doesn't help that the attributes this person uses as signals for a "10x engineer" are all kinda on the introverted side of the spectrum, as if introversion per se were, to put it charitably, a proxy for some kind of smartness. It's close to the old Hollywood trope of the computer guy being a weird, overweight, sexually frustrated white guy.

Ok, now that we closed the chapter on the critique of random people's tweets, let's get to something more interesting... 

Do "10x" engineers even exist? What are they? And what makes them?

Let's start from that last one. 

What's a 10x engineer? Typically we think of "10x" engineers as people that are much more productive than the average when working with code. Code "wizards", "ninjas", "rockstars" or whatever other cringe-worthy moniker teenagers use.

In my experience, 10x engineers do exist. Even controlling for seniority, knowledge, and skills, productivity is not uniform among people. I hope this is not controversial: one can know something, even be experienced in doing a given thing with a good track record, and yet not be as effective at doing it as others.

Should you hire only "10x" people? Definitely! Sort-of. In a way... We all look for excellent people, of course, and being able to distinguish good from great among a given seniority level is certainly important.
That said, there are lots of ways to be excellent beyond coding. In a decently-sized team other aspects might even be more important: mentoring, coordination, project management and so on. As things grow, code usually tends towards being an implementation detail, so to speak, secondary to product and people concerns.

Even if we just look at coding, there are lots of kinds of engineers, people who are great at handling huge, foreign code-bases, people who are great at fixing things, people who are great at creating new things, people who are great architects and so on...

Lots of things in which you can be "10x" - but still, the concept of productivity being separate from skill generally holds.

These multipliers are also among the hardest to assess in interviews, because again we're saying they don't correlate simply with what a person has on the CV or their ability to answer technical questions. Correctly characterizing where this productivity comes from is thus of utmost importance.

Why are some people more effective than others, given the same skillset? Where does that effectiveness come from? Is it their choice of editors? Their typing speed? Some sort of flow-related supernatural focus? I think not.

First Hypothesis: Output = Skill * Effort * Allocation

Skill is knowledge and experience, what we usually mostly correlate with seniority levels. As an analogy, I would say this is the value of the chips you have, at a gambling table.

Effort, given the same workday, is mostly focus. It's a time management skill, the ability to execute your tasks in good-sized chunks. In the gambling analogy, this would be the number of chips you have.

Focus is partially environment-related, but what we don't say often enough is that a lot of it is a skill. A lot of it also relates to how much a person likes doing a given thing, how fit they are for the job at hand. Effective managers that try to hire and allocate people to do the things they are passionate about can thus help to get the most from the effort multiplier.

The last aspect is what I called allocation. This is how you spend the chips you have. Second hypothesis: correct allocation is what "makes" a 10x engineer.

In other words, we all have a number of chips to spend each day on our tasks. And controlling for seniority levels more or less effectively captures this number; there isn't that much variance in it.
The part that has a lot of variance is the allocation. Not in what to do, as in prioritizing this bugfix over that feature (such skills are also fundamental and have very high variance, but we capture them well in job categories - think technical director versus principal engineer, for example), but in how we do things.

Do I use a scripting language, should I implement things in C, or maybe I should learn that fancy new language everyone's talking about? Do I rely on a library or write from scratch? Do I need to understand the overall architecture of this software? Do I need to understand the specifics of the functions I'm calling? When is it appropriate to be sloppy? Should I jump into prototyping, or do I need to learn about the state of the art first? Should I go deep or wide?

We always have a limited amount of resources: the ability to keep things in our brain, to do work. And software design has a lot of different dimensions: abstraction versus specificity, generalization versus integration, high-level versus low-level concerns and so on. 
The ability to navigate this design space and select the right tools for the job, both in terms of concrete artifacts (code, libraries, languages, IDEs) and of abstract methodologies, makes a huge difference. It's one thing to know about things; it's another to be able to critically evaluate tradeoffs and allocate (your) resources well.
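
Just to make the multiplicative hypothesis concrete, here's a toy sketch in Python (not a real model - every number, name, and the 0-to-1 allocation scale are invented purely for illustration):

# Toy illustration of Output = Skill * Effort * Allocation.
# All numbers are made up; the point is only that allocation multiplies
# everything else, so two engineers with identical skill and effort
# can still end up an order of magnitude apart.

def output(skill: float, effort: float, allocation: float) -> float:
    # Skill: the value of your chips; Effort: how many chips you get to play;
    # Allocation: how well you spend them (0..1).
    return skill * effort * allocation

good_allocator = output(skill=8, effort=6, allocation=0.9)  # 43.2
poor_allocator = output(skill=8, effort=6, allocation=0.1)  # 4.8
print(good_allocator / poor_allocator)  # ~9x, same skill and effort

The numbers don't matter; the shape of the formula does.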

Third Hypothesis. The reason why "10x" skills look mystical is that we don't have a solid theory for allocation choices.

When we have a design space that lacks a solid theoretical framework to navigate it, all success looks random, unteachable, and mystical. This is why "10x" engineers are both rare and sometimes described in horrible ways, like it was done in the Twitter thread above.

Too much of programming is still an art: eventually some people "get it" after lots of practice, but we don't really know how to replicate that success.

We accept that, out of the huge talent pool of software engineers, some will somehow find a way to be productive at some things, and some will be less successful.

---

Update: I was made aware of this, which is a much more concise way of conveying the same message:


It's great that I'm not alone in this...

02 June, 2019

The Value of Pixels (presentation slides)


Presented at the bay area game tech meetup, hosted at Roblox offices.
If you want to be notified of future meetups, join here.




28 April, 2019

On the “toxicity” of videogame production.

I was at a lovely dinner yesterday with some ex-gamedev friends and, unsurprisingly, we ended up talking about the past and the future, our experiences in the trenches of videogame production. It reminded me of many discussions I had on various social media channels, and I thought it would be nice to put something in writing. I hope it might help people who want to start their career in this creative industry. And perhaps even some veterans could find something interesting in reading this.

- Disclaimer.

These are my thoughts. Duh, right? Obvious, the usual canned text about not representing the views of our corporate overlords and such? Not the point.
The thing I want to remind you before we start is how unknowable an industry is. Or even a company, a team. We live in bubbles, even the ones among us with the most experience, with most curiosity, are bound by our human limits. That’s why we structure large companies and teams in hierarchies, right? Because nobody can see everything. Of course, as you ascend them you get more of a broad view, but from these heights, the details are quite blurry, and vice-versa, people at the “bottom” can be very aware of certain details but miss the whole. 

This is bad enough that even if internally you try hard, after a success or a failure, to understand what went right or wrong, most of the time you won't capture these factors objectively and exhaustively. Oftentimes we don't know at all, and we fail to replicate success or to avoid failing again.

Staring at the production monster might drive you insane.

So, I can claim to be more experienced than some, less than some others; it truly doesn't matter. Nobody is a source of truth in this; the best we can do is bring a piece of the puzzle. This is, by the way, a good attitude both towards oneself, knowing that we probably have myriads of blind spots, but also key to understanding what other people say and write. Even the best investigative journalists out there can at best report a bit of truth, an honest point of view, not the whole picture. 

To name names, for example, think about Jason Schreier, whom I admire (have you read “Blood, Sweat, and Pixels”? You should...) for his writing and his ability to do great, honest research. His work is exemplary, and still, I think it’s partial. In some cases, I know it is.

And that is ok, it’s intellectual laziness to think we can read some account and form strong opinions, know what we’re talking about. Journalism should provide a starting point for discussion, research, and thought. It’s like doing science. You chip away at the truth, but one single observation, no matter the prestige of the lab, means very little. 
And if we need multiple studies to confirm something even in science, where things are objective, measurable and unchanging, think how hard the truth is when it comes to entities made of people…

- Hedging risk.

One thing to understand is where the risk for abuse comes from. And I write this first not because it should be a personal responsibility to avoid abuse, but because it’s something that we don’t talk about. Yes, there is bad management, terrible things do exist, in this industry as in others, and they have to be exposed, and we have to fight. But that doesn’t help us to plan our careers and to take care of ourselves. 

So, where does the potential for abuse come from? Simply, an imbalance of power. If you don’t have options, you are at risk, and in practice, the worst companies tend to be the ones with all the power, simply because it’s so easy to “slip” into abusing it. Sometimes without even truly realizing what the issue is.

So, you should avoid EA or Activision, Nintendo, Microsoft and Sony, right, the big ones? No, that’s not the power I’m talking about, quite the opposite. Say you are an established computer engineer working for EA, in its main campus in Silicon Valley, today. Who has power, EA or you, when Google, Facebook et al. are more than eager to offer you a job? I’d say, as an educated guess, that the most risk comes in medium-sized companies located in countries without a big game industry, in roles where the supply of candidates is much bigger than the demand. 

Does that mean that you should not seek a career in these roles, or seek a job in such companies? Definitely not, I started exactly like that, actually leaving a safer and even better-paid job to put myself in the above-mentioned scenario. It’s not that we shouldn’t do scary and dangerous things, but we have to be aware of what we are doing and why. My better half is an actress, she’s great and I admire her ambition, work ethic, and courage. Taking risks is fine when you understand them, you make conscious choices, you have a plan, and that plan should also include a path to stability.

- Bad management or creative management?

Fact. Most great games are done in stressful conditions. Crunch, fear, failure, generally the entire thing being on fire. In fact, the production of most great games can be virtually indistinguishable from the production of terrible games, and it’s the main reason why I advise against choosing your employer only based on your love of the end product.

This I think is remarkable. And often times we are truly schizophrenic with our judgment and outrage. If a product fails, we might investigate the reasons for its failure and find some underlying problems in a company’s work conditions. Great! But at the same time, when products truly succeed we have the ability to look at the very same patterns and not just turn a blind eye to them, but actively celebrate them. 
The heroic story of the team that didn’t know how to ship, but pulled all-nighters, rewrote the key system and created the thing that everyone remembers to this day. If we were to look at the top N games of all time, how many would have these stories behind their productions?

Worse, this is not just about companies and corporations. Huge entities, shareholders, due dates and market pressure. It happens pretty much universally, from individual artists creating games with the sole purpose of expressing their ideas to indie studios trying to make rent, all the way to Hollywood-sized blockbuster productions. It happened yesterday, it happens today. Will it happen in the future? Should it?

- The cost of creativity.

One other thing to realize is how this is not a problem of videogame production, at all. Videogames don’t have a problem. Creative products do. Look at movies, at actors, film crews. Visual effects. Music? Theater? Visual arts? Would you really be surprised to learn there are exactly the same patterns in all these? That videogames are not the “worst” industry among the creative ones? I’m guessing you would not be surprised…

This is the thing we should really be thinking about. Nobody knows how to make great creative products. There is no recipe for fun, there is no way to put innovation on a predictable schedule, there’s no telling how many takes will be needed to nail that scene in a movie, and so on. This is truly a hard problem, fundamentally hard, and not a problem we can solve. By definition, creativity, research, innovation, all these things are unknown; if we knew how to do them up-front, they would not be novel and creative. They are defined by their lack of predictability.

In keeping with movie references...

And I don’t know if we know where we stand, truly. It’s a dilemma. On one hand, we want to care, as we should, about the wellbeing of everyone. We might even go as far as saying that if you are an artist, you shouldn’t sacrifice yourself to your art. But should you not? Should it be your choice, your life, and legacy? Probably. 
But then we might say, it’s ok for the individual, but it’s not ok for a corporation to exploit and use artists for profit. When we create packaged products, we put creativity in a corporate box; it’s now the responsibility of the corporation to ensure the wellbeing of the employees, and they should rise to higher standards. And that is absolutely true; I would never question such a fact.

Yet, our schizophrenia is still there. It’s not that simple, for example, we might like a given team that does certain products. And we might be worried when such a team is acquired by a large corporation because they might lose their edge, their way of doing things. You see the contradiction in that?

In general (in a very, very general sense), large corporations are better, because they are ruled by money: investors looking at percentages, often banks and other institutions that don’t really know nor care about the products. And money is fairly risk-averse; it makes big publishers cash in on sequels, big franchises, incremental improvements and so on. All things that bring more management, that sacrifice creativity for predictability. Yet we don’t really celebrate such things, do we? We celebrate the risk takers, the crazy ones…

- Not an absolution.

So, tl;dr: creativity has a cost in all fields, it’s probably something we can’t solve, and we should understand our own willingness to take risks, our own objectives and paths in life. Our options exist on a wide spectrum; if you can, you should probably expose yourself to lots of different things and see what works best for you. And what works best will change as your life changes as well.

But this doesn’t mean that shitty management doesn’t exist. That there aren’t better and worse ways of handling risks and creativity, that there is no science and no merit. Au contraire. And ours, being a relatively new industry in many ways, certainly the youngest among the big creative industries, still has a lot to learn, a lot to discuss. I think everyone who has a good amount of production experience has seen some amount of incompetence. And has seen or knows of truly bad situations, instances of abuse and evil, as I fear will always be the case when things involve people, in general.

It’s our responsibility to speak up, to fight, to think and debate. But it’s also our responsibility to not fall into easy narratives, oversimplifications, to think that it’s easy to separate good and bad, to identify from the outside and at a glance. Because it truly isn’t and we might end up doing more harm than help, as ignorance often does.

And yes.
These are only my 2c.

07 April, 2019

How to choose your next job (why I went to Roblox)

This is one of those (rare?) posts that I wasn't sure how to write. I'm not a fan of talking about personal things here, and even more rarely do I write about companies.

But I too often see people, especially juniors entering the industry, coming with what I think are the wrong ideas of how looking for a job works, even making mistakes sometimes that lead to frustration, an inability to fit into a given environment, and can even make people want to quit an entire industry altogether.

By far, the number one mistake I see is people who just want to go work on projects that they are fans of. In my industry, that means games they like to play. Not realizing that the end product does not really tell any story of how it was done and/or what your job will be like.

I do strongly advocate trying to follow your passions; that makes working so much better. And if you're lucky, your passion will even guide you to products you personally enjoy playing. But that should not be - I repeat, SHOULD NOT BE - your first concern.


"Airship station"
I've been extremely lucky in my career. I have worked for quite a few companies, on many games. I have almost always landed in places I love, working on projects I love. But only once have I actually worked on a franchise I play (Call of Duty - but even there, I play the single player only, so perhaps you could say I don't really play most of that either).

So, I'll do what most coaches do and elevate my small sample set, based on my personal experience, into a set of rules you might or might not want to follow. And at the end, I'll also tell a bit about why I'm now working at Roblox. Deal? Good, let's go.

- Know thyself.

The first thing is to know yourself. Hopefully, if you paid attention and are honest, over the years you form an idea of who you are and what you like to do, what motivates you.
It's actually not easy, and many people struggle with it, but that might not be the end of the world either. If you don't know, then you at least know you don't and can reflect that in your education and career choices.

In my case, I think I could describe myself as follows:
  • I'm driven by curiosity. I love knowledge, learning, thinking. This is nothing particularly peculiar, if you look at theories of human desire and curiosity, gaining knowledge is one of the main universal motivators.
  • My own intellectual strength lies mostly in logical thinking. I have always been drawn to math, formal systems. This is not to say I'm an extraordinary mathematician, but I do find it easier to work when I can have a certain degree of control and understanding.
  • I love creativity, creative expression, and art, particularly visual arts. 
  • I'm a very social and open introvert. What this means is that I like people, but I've also always been primarily focused inwards, thinking, day-dreaming. Especially as a kid, I could get completely lost in my own thoughts. Nowadays, I try to be a more balanced person, but it's a conscious effort.
Ok, so what does all this mean? How does it relate to finding a job? Well, to me, since a very young age, it meant I knew I would either be an artist or a computer scientist. And that either way it would probably involve computers.
That's why I was involved as a kid in the demo scene. After high school, I decided I wasn't talented enough to make a living as an artist, and I chose computer science. In retrospect, I had great intuition, as even today I struggle in my own art to break out of certain mental models and constraints. I might have been a good technical artist, who knows, but I think I made the right call. Good job, nerdy teenage me!

- Know thy enemy.

What you like to do, what you can offer. This second "step" matures as you gain more work experience - again, if you pay some attention. If you don't know yet, it's not a problem; it just means your objectives are probably more exploratory than mine. Your understanding is something that is ever-evolving.

What does all that psychological stuff above mean when it comes to a job? Well, for me it means:
  • I'm not a ninja, a cowboy, or a rockstar. I'm pretty decent with hacking code I hope, as you would expect from anyone with some seniority, but I'm not the guy that will cruise through foreign source, write some mysterious lines, and make things work. I need to understand what I'm doing to be the most effective, and I have to consciously balance my pragmatism with my curiosity.
  • On the other hand, I'm at my best when I'm early in a project. I gravitate towards R&D, solving problems that have unknowns. Assessing risks, doing prototypes, organizing work. Mentoring other people.
  • I don't care about technology or code per se. They are all tools, means to an end. I care about computer graphics, and that's what I know most about, but I am curious about anything, even outside computer science. So, even in R&D, I would not work in the abstract, on the five-years-out horizon, or on entirely theoretical matters. I'd rather be close to the product and people.
I'm a rendering engineer. At least that's what I've been doing for the past decade or so. But that's not enough. There are a million ways to be a rendering engineer. I think I'm best at working on novel problems, doing applied R&D, and doing so by caring about the entire pipeline, not only code.

There are another million ways to do this job, and they are all useful in a company. There's no better or worse. If you know what you can offer and like, you will be able to communicate it more clearly and find better matches. We are all interested in that, in finding the perfect fit. One engineer can do terribly at one company and thrive in another. It's a very complex handshake, but it all begins with understanding what you need.

- Profit?

Note: I don't mean that everything I wrote above is something you have to think about any time you send a resume. First of all, you should probably always talk to people, and never limit yourself. Yes, really. Send that CV. No, I don't care what you're doing, the timing, the company, just send that CV and have a talk. You never know what you might learn, don't make assumptions.

Second, it's silly to go through all this explicitly every time you think of a job. But. If you know all this, if along the way you made some effort to be a bit aware of things, you will naturally be clearer in your interactions and probably end up finding opportunities that fit you.

"Rip ur toaster"
Ok, let's now address the last point. Why Roblox? I have to be honest: I would not have written all this if a few people hadn't asked me that question. Not many - most of my friends in the industry were actually very positive, had heard good things, and actually made me more confident in the choice.
But in some cases, people didn't immediately see the connection between someone who has so far been doing only AAA games, and almost only for consoles, and a company that makes a platform for games mostly aimed at kids, mostly on PC and mobile, and with graphics mostly made out of flat-shaded blocks. So I thought that going through my point of view could be something interesting to write about.

Why Roblox and not, say Naughty Dog or Rockstar, Unity or Unreal? Assuming that I had a choice of course, in a dream world where I can pick...

Because I'm fascinated by the problem set.

Now, let's be clear. I'm writing this blind - I actually intended to write it before my first day, to be entirely blind. My goal is not to talk about the job or the company. Also, I don't want to make comparisons. I am actually a staunch proponent of the fact that computer graphics are far from being solved, both in terms of shiny pixels and associated research, and even more so in terms of the production pipelines at large.
Instead, I simply want to explain why I ended up thinking that flat shading might be very interesting. 

"Stratosphere Settlement"
The way I see it, Roblox is trying to do two very hard things at once. First, it wants to be everywhere, from low-powered mobile devices to PCs to consoles, scaling the games automatically and appropriately. Second, these games are typically made by creatives who do not necessarily have the same technical knowledge as conventional game studios. In fact, the Roblox platform is used even as a teaching tool for kids, and many creators start on the platform as kids.

This is a fascinating and scary idea. How do you do graphics with primitives that are simpler than traditional DCC tools, but that at the same time render efficiently across so many runtimes? In Roblox, everything is dynamic. Everything streams, and start-up times are very important (a common thing with mobile gaming in general). There is no baking; the mantra for all rendering is that it has to be incremental, cached, and degrade gracefully.

And now, in this platform with these constraints, think of what you might want to do if you wanted to start moving more towards a conventional real-time rendering engine. What could you do to be closer to, say, Unity, while retaining enough control to still be able to scale? I think one key idea is to constrain authoring in ways that allow attaching semantics to the assets. In other words, not having creators fully specify them to the same level as a conventional engine does, but leveraging that to "reinterpret" them a bit to perform well across the different devices.
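
To give a feel for what I mean - and to be clear, this is not Roblox's engine or API; the asset fields, device tiers, and choices below are all invented, a minimal sketch of the general idea:

# Toy sketch of "author semantics, reinterpret per device".
# Nothing here is a real engine or API; fields, tiers and heuristics are invented.

from dataclasses import dataclass

@dataclass
class LightAsset:
    importance: str       # the creator states intent ("hero" or "fill"), not implementation
    casts_shadows: bool

def reinterpret(light: LightAsset, device_tier: str) -> dict:
    # Decide how to actually render the light on a given class of device.
    if device_tier == "low":
        # Low-end mobile: no real shadows, and maybe drop non-hero lights entirely.
        return {"enabled": light.importance == "hero", "shadow": "none"}
    if device_tier == "mid":
        return {"enabled": True, "shadow": "blob" if light.casts_shadows else "none"}
    # High-end: honor the authored intent fully.
    return {"enabled": True, "shadow": "map" if light.casts_shadows else "none"}

hero_light = LightAsset(importance="hero", casts_shadows=True)
for tier in ("low", "mid", "high"):
    print(tier, reinterpret(hero_light, tier))

The point being that because the creator only expressed intent, the runtime is free to pick a very different implementation on each device.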

I don't know, I'm not sure. But it got me thinking. And that was a good sign. Was it the right choice? Ask me in a year or so...

22 October, 2017

The curse of success. Why nothing great is ever "good".

- The wedding.

A few weeks ago I flew back to Italy. My best friend was getting married, and I had to be his best man. 

The Amalfi coast is great for a wedding.

One night, a few days before the wedding, we were spending some time together when suddenly he starts getting phone calls from work. Some of the intranet infrastructure in the huge industrial oil and gas site he works in started failing, and even though he is not an IT expert, he happens to be the chief of maintenance for the entire site, and that also entails making sure their server room has everything it needs to keep functioning.

A few months back he had commissioned a partial remodel of the room to improve the airflow, as they had started experiencing some problems with the cooling efficiency of the air conditioners in the room. Fresh off that experience, a fear immediately dawns on his face: the servers are overheating, and that's why they started experiencing loss of functionality, starting with email.

He sends a technician into the room and his fears are confirmed: none of the four AC units are working, and it's more than fifty degrees in the room. Two of them are completely off; the other two have their control panels on, but the pumps are not working. To add insult to injury, they didn't receive any notification, apparently because the email server was the first to fail. 
After instructing the technician to open all windows in the room, it’s decided that he has to go on site to follow the situation. And as I didn’t have much better to do, I followed...

What came after was a night of speculations, experiments, and deductions that you might see in an episode of House M.D., but applied to heavy industrial machinery. Quite interesting to see from the perspective of a software engineer, debugging problems in code is not quite as exciting...

The problem turned out to be that one of the phases of a three-phase outlet was missing, and eventually the culprit was found: one cable in the power line had gone completely bust, possibly due to some slow decay process that had been going on for years, maybe triggered by a slight load imbalance, until an electric arc sparked between two contacts and immediately fried the system.

Fried connectors.

The two units that appeared to still be powered on had their controls wired to the two working phases, but even for these the pumps would not work, because they require all three phases to be present.

4 am, we're back in the car going home, and I was asking questions. Why did things go the way they did? What could have been done to prevent downtime of an apparently critical piece of IT? Why was that piece of IT even that critical, when apparently there was a mirror unit at another location? What exactly is that server room doing? It seemed obvious that there would be better ways to handle all that...

And then it dawned on me - this huge industrial site has a ton of moving parts; at any given time there are many ongoing maintenance projects, and even just monitoring them all is a challenge. Nobody knows everything about everything. Nothing is perfect, lots of things are not even good; in some ways it seems to be barely getting by, in others, it looks fairly sci-fi... You keep the machine going, you pick your battles. Certain things will rot, some stuff will be old and obsolete and wasteful, some other stuff will be state of the art.

Which happens to be exactly how we make videogames. And software in general! I've never been in a game company where there weren't parts of the technology that were bad. Where people didn't have anything to complain about. 
Sometimes, or often even, we complain only because we easily accommodate to a given baseline, anything good becomes just the way things are, and anything bad stands out. 
But oftentimes we have areas where things are just objectively terrible: old, primitive, cumbersome, slow, wasteful, rotten. And the more successful the game, the more used the engine, the bigger and better the end results, the more we risk letting some parts fall behind.

- The best products are not made with the "best" tools in the "best" ways.

It's easy to understand how this "curse of success" takes place. Production is a monster that devours everything. Ed Catmull and Amy Wallace describe this in chapter seven of the excellent "Creativity Inc." talking of production pressures as the "hungry beast". When you're successful you can't stop, you can't break things and rebuild the world, there's less space for architecture and "proper" engineering.

People want what you're making, so you'll have to make more of it, add features, make things bigger and better; quickly all your resources are drained trying to chase that dragon. On the other hand, the alternative is worse: technology that is perfectly planned, perfectly executed, and perfectly useless.

Engineers and computer scientists are often ill-equipped to deal with this reality. We learn about mathematical truths, hard science, all our education deals with rigorous theoretical foundations, in an almost idealized, Platonic sense of beauty. In this world, there is always a perfect solution to a problem, demonstrably so, and the goal of the engineer is to achieve it.
The trivialities of dealing with people, teams, and products are left to human resources or marketing.

Of course, that's completely wrong as there are only two kinds of technology: the kind that serves people, and the useless kind. But this doesn't mean there is no concept of quality in technology either! Not at all! But, we'll have to redefine the concept of "great technology" and "proper" engineering. Not about numbers, features, and algorithms, but about happiness and people: problems solved, results delivered, needs addressed...

The gorgeous gothic church of San Lorenzo Maggiore, a patchwork of styles

Great technology, then, seems not to be defined by how perfect and sparkly clean it is on the inside (even if sometimes that can be a means to the goal) but by a few things that make it unique, and lots of hard work to keep everything in working order. 
If you are very aggressive in prioritizing the end product, inevitably how it's done, the internals, will suffer. 

But the alternative is clearly wrong, isn't it? If you prioritize technical concerns over end-user features, you're making something beautiful maybe, but useless. It's gradient diffusion: the farther you are from the output, the more your gradient vanishes.
The product and its needs are what drive the gradient of change the most. The tools that are used to make the product are one step farther: they still need to adapt and be in good shape in order to accommodate the product's needs, but the gradient is smaller. And so on: the tools that are made to make the tools for the product have an even smaller gradient, until it vanishes and we deal with technology that we don't even care to write or ever change, it's just there for us (e.g. our workstation's OS, Visual Studio internals, what Mathematica is doing when I'm graphing something, how Outlook works, etc...).

The thing that I happen to say most often nowadays when discussing clever ideas is "nice, but what problem is it solving, in practice, for people, today?".

The role of the engineer should then mostly be to understand what your product needs, what makes a difference, and how to engineer solutions to keep things going, how to prioritize your efforts when there are thousands of things that could be done, very little time to do them, and huge teams of people using your tools and technologies that can't stop making forward progress...
That also explains why in practice we never found a fixed recipe for any complex system: there are always a lot of different ways to reach success. Engineering is not about trying to find -the- best solution, but about managing the ride towards a good one.

- Unreasonable solutions to pragmatic goals.

All this though does not mean that we should be myopic, and just create things by getting precise measures of their effects, and thus optimize for the change that yields the biggest improvement. You can do well, certainly, by aggressively iterating upon your technology, polishing it, climbing the nearest hill. There is value to that, but to be truly excellent one has also to make space for doing crazy things, spending time chasing far-fetched ideas, down avenues that seem at first wasteful.

I myself work in a group that does a lot of research, thus lives by taking risks (if you're not scared, you're not doing research, by definition you're just implementing a solution). And most if not all of the research we do is very hard to correlate to either sales or user engagement. When we are lucky, we can prove some of our stuff saved some time (thus money) for production.

And at one point, one might even need to go explore areas of diminishing returns...

It's what in optimization is called the "exploration versus exploitation" tradeoff: sometimes we have to trust that in order to achieve success we don't have to explicitly seek it, we have to stop explicitly looking at these measures. But that does not mean that the end goal stops being very pragmatic!
What it means is that sometimes (sometimes) to be truly the best one has to consciously dedicate time to play around, to do things because they are interesting even if we can't prove they will lead anywhere. Know how to tame the production beast.
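
If you want the textbook version of that tradeoff (nothing specific to game tech, just the classic toy formulation), an epsilon-greedy bandit captures it in a few lines: with probability epsilon you try something at random, otherwise you do whatever has worked best so far.

# Epsilon-greedy multi-armed bandit, the simplest formalization of
# exploration versus exploitation. The two "projects" and their payoffs
# below are entirely made up for illustration.

import random

def epsilon_greedy(reward_fns, steps=1000, epsilon=0.1):
    counts = [0] * len(reward_fns)
    means = [0.0] * len(reward_fns)
    total = 0.0
    for _ in range(steps):
        if random.random() < epsilon:
            arm = random.randrange(len(reward_fns))                    # explore
        else:
            arm = max(range(len(reward_fns)), key=lambda i: means[i])  # exploit
        r = reward_fns[arm]()
        counts[arm] += 1
        means[arm] += (r - means[arm]) / counts[arm]                   # running average
        total += r
    return total

# A "safe" incremental project versus a "risky" far-fetched one.
safe = lambda: random.gauss(1.0, 0.1)
risky = lambda: random.gauss(1.3, 1.0)
print(epsilon_greedy([safe, risky]))

Set epsilon to zero and you only ever climb the nearest hill; set it too high and you never cash in on what you've learned. The balance is the whole game.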

In practice, it's a tricky balance and a lot of exploration is something that not many teams can realistically achieve (while surviving). Great engineering is also about understanding these tradeoffs and navigating them -consciously-.