
Lately, I've been working on an album recorded with only real instruments: acoustic guitars, bass guitars, and drums, plus lead and backing vocals.
For these types of productions, people usually record a demo track to guide them while recording the actual takes in a studio, and they start by recording the drums. This is an unwritten (maybe written?) rule in the recording industry that every professional knows.
The point of having even a mediocre demo is to give the drummer some feel for the song. They can sense when to lower their velocity and when to hit harder. Otherwise, they would be playing to a click track without hearing or feeling anything of the song itself.
After recording the drums, some amateur musicians go straight to recording the other instruments on top of those drum takes. The problem is that the drum tracks still need editing afterward, so you end up recording all the other instruments against the wrong groove from unedited drums.
This is why you don't usually record the other instruments on the same day. Another reason is that setting up a drum kit and the microphones takes half a day! That leaves you only a couple of hours to record the drums, which is enough if you have a decent drummer.
So instead, people edit the drums first and record the other instruments on top of the edited drums. The process usually continues the same way: you edit each instrument right after you finish recording it, so you always have a solid groove to build on.
So, this is the rule.
Or, this was the rule.
Because nowadays people don't record acoustic drums for their tracks that much (which I don't like); they use drum plugins playing MIDI instead. And people don't record demos as much as in the old days (sorry for talking like I'm 50 years old), usually because they think these plugins sound good enough. Considering the alternative is renting a studio, hiring a drummer, and spending a whole day and an enormous amount of money just to record drums for one song, they're pretty much right.
But I don't like listening to those sampled drum plugins because they don't sound like a real human being. They are just too perfect and repetitive. In my opinion, even an average listener can tell the difference between sampled drums and a real drummer.
After listening to a couple of my client's rough mixes, I realized that all the instruments were real except the drums. They were using Logic Drummer, which is helpful for demos during songwriting, but I definitely don't like hearing it in a final mix. I can tell within five seconds if you're using Logic Drummer in your song.
I loved the album and thought it could be much better with real drums, so I offered to record them with a real drummer. Since I work with some of the best remote session musicians on the internet, I suggested a couple of drummers, and we decided on one.
The problem with this production, and many others nowadays, is that the drums are usually the last thing you record. It was the same for this project. I thought it could be problematic if we didn't edit the guitars and all the other instruments beforehand.
So I also offered to edit all the tracks. They accepted.
But I made a mistake.
I generally don't like editing, and I'm lazy even when editing my own songs. I started editing those tracks by hand in Pro Tools, but when I got bored and realized how much time it would take, an idea came to mind. One of my producer friends had told me that the auto-quantize feature in Logic Pro works very well. He said he only touches up a few spots to fix minor issues after auto-quantizing.
So I thought this could be a good idea.
I was wrong.
I ruined the performances my clients had worked hard on by quantizing them to the grid. Even though I spent time touching up some spots, I stripped them of their soul.
But I was proud that I had finished editing all those tracks before the recording sessions. Technology had helped me again.
I was wrong.
They realized I had ruined their performances, and they genuinely told me so. It took me a while to accept it, but I got it after simply turning off all my edits on the album's first track. Yes, there were some mistakes, but overall it sounded much better than my edited version.
I thought about what would be best for the album and how I could fix my mistake, considering we had already recorded all the drum tracks on top of those auto-quantized instruments.
I realized it wasn't a big problem in the end, because the drummer was playing to a click track anyway, and maybe it was even nice for him to have very tidy backing tracks.
When I decided to start everything over from zero, I realized that WE DON'T SEE MUSIC. Right now, I'm thinking about making a poster of that quote and hanging it on the wall so I can always see it while I'm working. I'm thinking about it because we keep forgetting it. We believe we have to edit a section because we see grids and lines on the screen, and we SEE that we made a mistake while playing that part.
The worst part is when we don't even listen, or look; we just automatically click the quantize button.
Which is wrong.
Because we don't effing see the music.
We listen to it.
When you release it to the masses, the only thing they will see is your artwork. Nothing more. They will listen to it like normal people.
So why do we stare at those grids while editing our beloved tracks? There is no point. People didn't edit our favorite albums like that 50 years ago, because they had to judge the tracks with their ears. They had to have a real reason to edit those tracks with real scissors. They would re-record a part from scratch, or they would overdub it. But they judged by listening, not by looking at dots and lines on a digital screen.

So, what's my conclusion from that story? How do I edit those tracks now? What did I learn?
Right now, I don't even edit the drum tracks because I realized that the drummer I'm working with is incredibly talented, and he's like a groove machine. He just doesn't make mistakes. But at the same time, he doesn't sound like a machine.
So I first dim my screen. I hit play and look out the window at the beauty of the trees this autumn, which is much better than looking at those freaking lines on the screen, and I judge the performance with my ears. Whenever I hear something I don't like, I stop playback, turn the screen brightness back up, and edit only the part that bugs my ears.
I repeat that process with every instrument I add to the song. I turn up the volume so I can hear the drum tracks clearly, and I fix a section only if I spot a mistake.
When editing, I don't use the smart grid option, because I don't want to edit things to perfection (which doesn't exist in nature).
Just recently, I learned something from Louis Bell about editing MIDI tracks. Usually, we record a real performance with our MIDI keyboards and hit the quantize button immediately, which is another way of killing a performance. He suggests editing a section by selecting all of its notes and moving their positions together. That way, the small timing differences between the notes of a chord stay there, and so does the reality of the performance. We're not machines; we can't hit every note at exactly the same time.
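If it helps to picture the difference, here's a tiny Python sketch (my own illustration, not something from Louis Bell or any particular DAW): hard quantizing snaps every note to the nearest grid line, while moving a selection together shifts all the notes by one shared offset, so the human spread between them survives.

```python
# Illustrative sketch only: onsets are in beats, the grid is in fractions of
# a beat (0.25 of a beat = a sixteenth note when the beat is a quarter note).

def hard_quantize(onsets, grid=0.25):
    """Snap every onset to the nearest grid line; the human spread is lost."""
    return [round(t / grid) * grid for t in onsets]

def shift_together(onsets, grid=0.25):
    """Shift the whole selection by one shared offset so the first note lands
    on the grid, keeping the timing differences between the notes."""
    offset = round(onsets[0] / grid) * grid - onsets[0]
    return [t + offset for t in onsets]

# A chord played a hair late, with a few milliseconds of spread between notes:
chord = [2.031, 2.043, 2.055]

print(hard_quantize(chord))                          # [2.0, 2.0, 2.0] -> robotic
print([round(t, 3) for t in shift_together(chord)])  # [2.0, 2.012, 2.024] -> still human
```

The numbers are made up, but the idea is the same as dragging a whole selection into place in your DAW instead of quantizing note by note.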
Of course, there are some genres where you should sound like a robot.
We'll already have AI robots creating original (?) music in a few years. So what's the point of sounding like them?
The future is here, pushing everyone to be more original and creative to survive. The best way to do that is to be yourself, a human being. And humanity ain't perfect.
Contact me here to create original songs together.