
Posts tagged ‘Office’

Brief thought exercise: how closely do you connect your self-worth to your job?

Been thinking about this for a few years now in various forms: essentially, how do you view work in the grand scheme of your overall life? We clearly spend a lot of time there, even if there’s no hard science around exactly how much – by some measures it’s about 1/3 of your adult life (this obviously varies by where you live). We know work has been shown to be tied to feelings of self-worth and self-esteem, which is only logical because, well, you spend a lot of time there. But there are dangerous aspects to connecting too much of your self back to work: first of all, a majority of managers aren’t very good, so you might be putting your self-worth in the hands of someone who isn’t really equipped to be handling it, and the science/culture around performance reviews isn’t exactly stellar (neither is the role of truly “organic” feedback in the workplace).

You take all this together and it’s an interesting picture: if you wrap up a lot of your personal definition in your work, that’s obviously a high-risk, high-reward situation. You could get chopped down at certain levels, but you could also do well, advance, make more money, etc. If you don’t tie up your self-esteem with work, you might be somewhat detached at work — or seem less passionate to managers — which can lead to you squandering time and not being given extra responsibility. It’s kind of an odd circle that keeps flowing.

Personally, when I was younger, I used to think jobs were very important. If I saw others getting more opportunity than me, I’d be mad/sad/some other emotion I can’t define. When you see your friends getting married and getting plum job assignments and travel opportunities and you’re single and mostly cubicle-jockeying, it can be hard. I probably had a bad attitude around that time, too, which only worsens the problem in terms of manager perception / ability to rise up. As I got older, I tried to tie jobs cognitively to a sense of wanting to do well — achievement, in other words — but also realizing that there are a million and five side factors that go into advancement and all that, and that ultimately, at base level, work is pretty much a means to an end. Unless you do something you’re amazingly passionate about or something that helps to save/define other areas, it’s a job that you do, try to do well, and then go home to the rest of your life. Cliche alert: very few people, in their final days, wish for more work. They wish for more family time, or time with people they loved from work, but not often workdays themselves.

That part, I think, is important.

I just started a new gig and my goal is to be the best I can and excel there, but I also want to contextually remember that it’s a means to an end and it’s not my definition of self-worth.

What do you all think? Do you find yourselves tying up work and self-esteem?

Brief thought exercise: in reality, does your job need to exist?

To start, read this (from PBS), then this (from Slashdot). David Graeber teaches at the London School of Economics — so broadly, he’s vetted — but he’s also a leading thought person in the Occupy Movement — so theoretically, you could dismiss him as a crackpot. He wrote a piece last summer for a left-leaning British zine entitled “Bullshit Jobs.” It went fairly viral (translated into about 20 languages and re-shared) and while his primary academic focus seems to be debt and what to do about debt, he makes some interesting points regarding the creation of jobs in a knowledge economy.

Here’s the crux of his idea, from his interview with PBS:

I’d say 20 percent. But it’s hard for me to say. The last thing I want to do is come in and say, you, your job is BS, while you, you’re okay. The whole idea is that people should decide for themselves what’s valuable. But if you talk about jobs where the people who actually are working at them secretly feel that they really don’t produce anything, or don’t do anything, I’d say about 20 percent has been my experience. But, of course, you know, we’d have to do extensive research to see if that’s really true.

So, in an admittedly unscientific manner, he’s claiming that 1 in every 5 people has a job that essentially doesn’t add value or doesn’t need to exist. That estimate is probably high — a better way to think about it is that everyone is valuable as a person, but oftentimes there will be 12-14 people doing the work of 8-10. This is fairly common in mid-size to large organizations, and massively common in large to very large organizations. Last summer I worked in one of the latter, and the floor I was on had about 118 people. I once had a mid-senior manager tell me, “I’m not entirely sure what 40 or so people on this floor do.” Sadly, I don’t think that’s horribly uncommon.

Work needs to shift in the coming generation, especially with the exiting of the Baby Boomers — see here and here, for example — but the big question is, Can it? Obviously I’m talking about U.S.-based work, as that’s what I know and understand. The context is different in Scandinavia and Asia, and I realize that. In going through the last couple of jobs I’ve had, though — both short-term and full-time — I realize that a lot of them, conventionally, would fall into that 20 percent bucket above. This rolls up with employee engagement figures, which indicate that (at some level) 77 percent of people are not fully engaged at their jobs.

I’m not sure work can change — it may be more of an issue of when an organization has to change that it does, and enough of them do that it becomes a “new norm” (probably several generations hence, if at all) — but I do think most organizations could do a bit of a critical re-evaluation of themselves at core. Primary management tenets are based on literature from the 1920s-1950s, and the world was a much different place then as compared to now. The approach needs to be updated; in some cases, it is.

I’ve wondered a little bit in my life about why “jobs” are even necessary at all; for example, couldn’t an organization just hire the 20 best people they find in a process, then trust them to learn the roles necessary for success? (So long as a leader and a plan are already in place.) I realize that sounds crazy, because humans need structure and some jobs need to be specialized — from an engineer (which in some contexts doesn’t require licensure) to a doctor (which requires a ton of time and debt). But in a way, is the idea of specialization (which was once a bedrock of the emergent U.S. economy) now actually a detriment? (Remember: it’s a badge of honor for some organizations to be able to say “We’re hiring” or “We’re growing,” so oftentimes creating jobs is done as kind of a theoretical pissing contest, not necessarily aligned to any real need.)

So look, take out your own views on the fundamental nature of your job. Maybe you love it, or love the field, or whatever it is. But does it really need to exist? Think about it. It’s kinda interesting.



Results-Only Work Environment, or ROWE, should be the wave of the future. The problem? Often, employees are treated like children.

There’s a long article on Slate from over the weekend — I think it’s one of their most popular and most shared, all that — about ROWE, or Results-Only Work Environment. Here’s the article. I’ve written about this kind of stuff a ton on this blog — see here and here, for example — so I won’t go super deep into everything (you can browse my other posts if you feel so inclined), but here’s a basic framework.

ROWE is based on the idea that a lot of old-school sales guys talk about: “My job is based on commission. If I don’t sell, I don’t eat.” This is based on the oldest-school motto of “eat what you kill,” which was quite literally how the world worked in its earliest incarnation. Under ROWE, you can choose to work from home whenever you want — with no vetting required. You can take 30 days of vacation if you want — with no vetting required. You can disappear for weeks at a time — and no one should be asking after you. All that matters is that you get your work done. There’s a series of metrics tied to your role — say, 10 percent sales growth or 50,000 new customers — and if you hit it, you get retained, paid, and possibly promoted. If you don’t hit it, there are consequences. Simple, right?

The obvious advantage here is that workplaces shouldn’t be standardized. In a 10K-employee office or a 200-employee office, not everyone is the same. A set of tasks takes Person A 45 hours to complete M-F; that same set of tasks may take Person B 32 hours to complete. Why, then, are Person A and Person B working the same schedule? And if they are, why isn’t Person B’s schedule designed around the idea of “20 percent time” or something similar? Since people aren’t the same, the sheer fact that hundreds (or thousands) of people can work essentially the same schedule is mind-boggling. If the goal is results (be that money, people enrolled, outreach, whatever) and you hit the results, who cares how long you’re there or where you’re sitting?

This goes to the second perceived advantage: theoretically, ROWE should reduce politics in an office. Seat time is massively important to some managers, almost regardless of context. Example: I worked with a guy a few years ago who consistently got in at 10am or 10:30am (we all got in at 8:30-9am). Our main boss didn’t see him enter at 10:30am, but this guy also often stayed until 7pm (we would leave at 5:30pm or so). Our boss did see him exit later. He got handed plum assignments all the time. It’s fine — he was good at his job — but it was a bit awkward because the rationale was always the exit time (or partially the exit time). He actually worked fewer hours than most of us, but he made it seem like he was a grinder. Under ROWE, all that should matter is results, plain and simple. Throw out the politics.

The third major pro here is that the world has changed, and so too has work. Most management philosophies are rooted in the 1950s (or even before!), but in the 1950s-1960s, we didn’t have shared software and e-mail and Skype and … well, hell, the list goes on for miles. The need for face-to-face is less. But here’s where the problems emerge.

The goal of a company/organization, in most cases, is to make money. When that goal isn’t being met, people tend to standardize back to the norm — “Maybe we shouldn’t try anything too crazy, because our competitors could get an advantage then!” — as opposed to innovating. You just saw this at Yahoo. Marissa Mayer came in with a huge amount of buzz/clout/context/reputation and immediately shredded the work-from-home policy. Best Buy was one of the companies using ROWE; when they hit a rough patch in 2008 and a bit beyond, they ultimately dropped ROWE (although via Slate, some arms of the company still use it covertly). When revenues are on the line, people want the management structure to be basic and comfortable, so often that becomes the norm (if it wasn’t the norm first).

That makes logical sense, but there’s one major problem therein: standard management is rooted in the idea, for better or worse, that if a person isn’t monitored actively, they will eff off and do something else (something more enjoyable or engaging) with their time. Sure, this happens, and it’s probably closer to the norm than not. But it creates a work culture — and again, this is broadly speaking, whereas specific elements may be different — where adults are often treated like children (“You can’t work from home because the assumption is that you’d be off-task”). This was fine in the almost completely male workforce of the 1940s/1950s, but now you have two-income households, both parents working, child care costs rising, different schedules, etc. The context is different, so start treating employees like adults again. It could bolster engagement, which can bolster retention (who’s going to leave a job where they have a sweet deal that works for their spouse and their kids?), which can ultimately save costs.

And if you think letting adults have “free time” is doomed to fail, consider some Google initiatives — which have done just that and led to money-making products for the company.

There is an aspect of all this where anything different/new can be feared — but what if it doesn’t work? – and that hurts the adoption of things like ROWE. By and large, people can be brilliant in their own ways, but thinking that our most creative, forward-thinking people are in mid-to-large sized companies is probably a fool’s errand. Convincing others of change, be it a new diet or a new car or a new management structure, is one of the greatest, oft-studied challenges of human existence. So look, all this stuff is hard. But it can be simple: basically, take adults and treat them like adults. Assume they want to succeed, want to keep working there, want to keep earning money, and will push for results in order to do all those things. Don’t assume that if you let them have some time back, or work from another location, or whatever … that it automatically means they’re off-task. Think positively about your people, treat them as you wish you’d been treated on the way up the ladder, and see what happens. That’s how you start seeing more ROWE in the world.

Is entrepreneurship dying? And if so, is one reason (aside from the economy) the general distaste for bureaucracy in an entrepreneurial culture?

If you really think about it, one of the most tangible selling points of America — one that you’ll hear bandied about quite often — is the idea of “entrepreneurial spirit.” This ties back to the American Dream; it’s the concept that, because of freedom (the central American tenet), people can go and do whatever they want, so long as they have some kind of plan + work ethic around it. This has been a factor in the rise of many major innovation centers, from Detroit (back when) to Silicon Valley (today). Now, though, there’s a new Brookings Institution study indicating that entrepreneurship has reached a three-decade low. Ruh roh.

There are no cause-and-effect relationships claimed in the study, so people can fill those in themselves (dangerous). There are a couple of different charts to consider — aren’t there always? — but one semi-troubling one is this one:


Essentially, “firm exit” is now out-pacing “firm entry.” That hadn’t happened since the late 1970s, when the data set begins. Even if you think some of these results are skewed (and they may be), it’s hard to argue with the basic logic: if more firms are exiting than entering, it’s probable that fewer companies are being started and run by entrepreneurs.

A lot of this comes back to the idea of shying away from risk: the economy is better than in 2008, but it isn’t great. Jobs are hard to come by and the process of recruiting / hiring / developing is a bit off. Fewer people — not “nobody,” but certainly fewer — want to go all-in with their own money (often) or money from their family (also often) on a venture that, inherently, has a lot of risk.

There’s an interesting side corollary here, though: the idea that to truly be successful, an entrepreneur needs to embrace bureaucracy early on (i.e. Human Resources and other support services), which seems like the Antichrist to some entrepreneurs. Cue The New York Times:

The Stanford Project on Emerging Companies, a longitudinal study of 200 Silicon Valley start-ups during the first dot-com boom, found that tech entrepreneurs gave little thought to human resources. Nearly half of the companies left it up to employees to shape the culture and perform traditional human resource tasks. Only 6.6 percent had the type of formal personnel management seen at typical companies.

Bureaucratic H.R. is “loathed” by engineers because it adds costs and slows decision-making, the leaders of the study, James N. Baron and Michael T. Hannan, wrote in a paper in California Management Review.

Yet a human resource department is essential. The two found that companies with bureaucratic personnel departments were nearly 40 percent less likely to fail than the norm, and nearly 40 percent more likely to go public — data that would strike many Silicon Valley entrepreneurs as heresy.

“In the new economy, as in the old one, it turns out that organization building is not a secondary diversion from the ‘real’ work of launching a high-tech start-up,” they wrote. “It might well prove to be the main event.”

This is very interesting, because there are elements of Human Resources (and other support functions) that can be a train wreck, although I’d argue that a lot of that is rooted in historical context and not actualized fact. What I mean there is that oftentimes, people running a company in 2014 are in their 50s/60s — which means they came up working in an era when HR was “personnel” or “admin” or an adjunct to the secretarial pool. There is still a big culture around HR as an office cop — they do reports on you and hire/fire you — but it can be made into a more proactive force in an office if you allow the people there to focus on (a) talent strategy (and really focus on it, not just do phone screens), (b) org development (culture of meetings, “fun” activities), (c) training (basically what separates a good company from a great one), and (d) data. Everyone loves Big Data, even though no one really knows what it is. HR already has the data on employees — from hire date to different reviews, etc. — so they should be a center of using that data for good (i.e. to move the company forward) as opposed to for stagnation (i.e. holding the data) or reversal (i.e. using the data to justify terminations and pay cuts).

That above paragraph is all an aside about how you can make HR seem less bureaucratic and more proactive, but these different studies, when taken together, have a lot of potential repercussions for the American economy. There’s long been a disconnect between how cities plan to attract entrepreneurs and what entrepreneurs actually want, but if their numbers truly are dropping, perhaps that matters less. We’re entering an interesting time in the American workforce: as the Baby Boomers age out, we’re going to lose millions of workers, and we may not have the people to replace them. If basic costs like milk and meat and gas continue to rise (they logically will), people will feel pinched and the entrepreneurial spirit will continue its (general, not specific) decline. That means millennials will end up entering larger, more-established companies for work — which leads to interesting ideas about the potential demise of hierarchy (or not) — or will end up flooding “the leisure economy” (and probably not making a ton of money). In short, a worst-case scenario for the decline of entrepreneurship is (a) a further-shrinking middle class and (b) probably fewer cool beer bars being opened by disaffected hipsters.

It’s very easy to look at some of the above data/studies and say “Oh, that’s not right” or “Oh, America shall rise again, just like always!” Indeed, those things may be true. But even if you think the idea of declining entrepreneurship is bollocks — and maybe it is — it would be very hard to argue with the idea that the workforce is about to drastically shift (both in content and context) and the best companies will be the ones that are legitimately ready for that.

Brief thought exercise: why are people generally more receptive to “let’s hop on a call” or “let’s schedule a meeting” than a three-line e-mail that explains the situation?

I’ve wondered this constantly in different jobs I’ve had, and even aspects of the job search. You can write a pretty short, to-the-point e-mail that explains your situation or a project’s situation and one of the first series of responses you get is “Great, let’s schedule a meeting” or “Let’s schedule a phone call.” OK, but … the e-mail just talked about the status. Shouldn’t we wait to schedule the meeting until there’s something to discuss or consider?

I’ve always wondered why this happens. A couple of potential theories:

1. In the grand scheme of a work life, e-mail is relatively new (although it’s a currency now). There’s a period of time where new things are still not completely trusted. Maybe e-mail is in that range.

2. People process e-mail very quickly, on the move or between other things, so perhaps sending out an opportunity to meet/discuss makes them feel like they’re really grounding the issue.

3. Seeing people face-to-face to discuss an idea — or at least via phone — is very important to some, which is one of the primary arguments against the idea of “work from home.”

4. The true cultural currency of the modern workplace is how busy you are, so taking an update and turning it into a chance for another meeting only serves to make you that much busier.

The problem here is, meetings are the ultimate time suck, which can become the ultimate productivity suck. I wish there were more workplaces — and certainly more headhunters/recruiters — who were content to be OK with progress updates happening via e-mail, and bigger discussions happening in person/via Skype or phone.

If anyone has additional thoughts on this, feel free to leave them in the comments.

What if one weekly meeting took up 300,000 hours of manpower in a year? That’s the entire year of 34 people’s lives. But this stuff happens.

Take a deep breath and say it with me: not everything needs to be a meeting. Pause, and now say it loud and say it proud: some things can be an e-mail, a quick talk in the hallway, or a trip to the gym. Pause again, take a swig of that coconut water, and repeat: the meeting is not the be-all and the end-all.

I heard an interesting story on a job interview I did about a week ago. One of the guys I was interviewing with used to work for a pretty big company (insurance sector), and one thing they would do is go around to conference rooms and hang up signs of how much the meeting inside was costing — let’s say it’s a 1-hour meeting with 21 people. You basically take the hourly salaries of everyone in there and add ‘em up. For certain meetings, you’re strolling past and are like, “Jeebus, that’s a 12K meeting?!?!?” Now of course, it doesn’t actually work like that — it’s not that hard and fast, in other words — but it’s an interesting device to get people to contextually re-think the experience.
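That sign-on-the-door device is just arithmetic, and easy to sketch. The rates below are invented for illustration (the story didn’t include actual salaries):

```python
# Rough meeting-cost sign, the way the insurance company's signs worked:
# sum everyone's hourly rate, then multiply by the meeting length.
# All rates here are hypothetical examples, not figures from the story.

def meeting_cost(hourly_rates, duration_hours):
    """Total cost of a meeting: everyone's hourly rate, summed, times its length."""
    return sum(hourly_rates) * duration_hours

# A 1-hour meeting with 21 people at a made-up blended rate of $60/hour:
rates = [60] * 21
print(meeting_cost(rates, 1.0))  # 1260.0
```

Loaded costs (benefits, overhead, opportunity cost) would push the real figure well higher, which is how a senior-heavy meeting creeps into five figures.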

Hopefully this is too: via Harvard Business Review, here’s a weekly senior leadership meeting that ultimately ended up costing the company involved about 300K work hours (across all levels) in a year. 300K hours is the equivalent of the entire year for 34 different people. How did it work? Well, you can click the link to look at all the math, but essentially, senior staff needs to meet with unit chiefs (to get updates), unit chiefs need to meet with their direct reports, their direct reports need to have all-hands meetings, etc. In the end, it totals about 300K hours — again, the entire annual life of 34 humans.
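The “entire year for 34 people” conversion is worth seeing on paper: 300K hours divided by the 8,760 hours in a calendar year (24 × 365):

```python
# Converting the HBR figure into whole person-years of life,
# using all 8,760 hours in a calendar year (24 * 365).
HOURS_PER_CALENDAR_YEAR = 24 * 365  # 8760

total_meeting_hours = 300_000
person_years = total_meeting_hours / HOURS_PER_CALENDAR_YEAR
print(round(person_years, 1))  # 34.2
```

That’s every hour of those 34 lives, sleep included, consumed by the ripple effects of one recurring meeting.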

Because everything that makes a difference in business discussions is ultimately revenue-facing and has ROI, here’s the essential takeaway from the Harvard study:

How companies can use time effectively is just one piece of a larger and ultimately more important puzzle: how to increase the productivity of their people. Boosting human capital productivity (HCP), we have found, is a powerful and often-neglected pathway to better performance.

Our research quantifies what’s at stake. Using a decade’s worth of data for the S&P 500, we looked at revenue per employee, a crude but useful measure of HCP. Then we compared those figures with each company’s financial performance. Since revenue per employee varies widely among industries, we confined our comparisons to companies in the same business.

The results jumped out at us. The best companies — those in the top quartile of revenue per employee — did 30% better than their peers in return on invested capital, 40% better in operating margin, and 80% better in revenue growth. Those differences contributed to a whopping 180% differential in total shareholder return over the 10-year period.
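The “crude but useful” proxy the researchers describe is trivial to compute; here’s a sketch with invented same-industry figures (not the study’s data):

```python
# Revenue per employee as a rough human-capital-productivity (HCP) proxy.
# Both firms' numbers are invented for illustration.

def revenue_per_employee(annual_revenue, headcount):
    return annual_revenue / headcount

firms = {
    "FirmA": (500_000_000, 1_000),  # $500M revenue, 1,000 employees
    "FirmB": (500_000_000, 2_500),  # same revenue, 2.5x the headcount
}
for name, (revenue, heads) in firms.items():
    print(name, revenue_per_employee(revenue, heads))
# FirmA 500000.0
# FirmB 200000.0
```

As the researchers note, the comparison only means something between firms in the same business, since the metric varies widely across industries.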

This all rolls up with one of the central challenges of modern business. Because everything is so quarter-focused (i.e. every 90 days, and the returns you’re offering within that span), priorities are very tight. People thus tend to focus on things they can control, as opposed to things that might make a difference (do make a difference) but are more intangible. Examples: social media, hiring, and how to run meetings. One of my friends used to tell me that the most frustrating part of his day was that he felt like he didn’t actually do any work until 4pm — when the meetings were finally over. At that point, he felt drained (we all do) and put in maybe 2-3 hours on follow-up from the meetings before the same process began anew. Repeat five times and that’s a work week. That’s not how the human brain best processes work at all.

Basically, it breaks down like this: spending 300K people hours on 42 or so weekly senior staff meetings is a total waste of time, and most people involved in those 300K hours would probably actively admit that. The problem is — it seems like the right thing to do. Senior staff has to talk, and has to know what’s going on around the business. Right? Right? So it’s a necessary evil, no? I mean, right? Well, yes and no. Senior staff does need to communicate — everyone needs to communicate — but there might be ways to cut that 300K figure down (and again, not every company is spending 300K hours on preparing for a once-weekly meeting). You can do simple things: have a shared Google Doc where, every Thursday, team members leave updates on their progress (both pro and con) on different deliverables. If the team lead needs clarification, he talks to the person/people whose stuff needs clarification. Cool. Right there, with one shared doc, you eliminated a bunch of meetings. That funnels time back to people where they can be productive on actual work (or, hell, they can schedule different meetings with other people).

Or, you could hold the same number of meetings but simply hold them standing up … it typically reduces the time involved by 34 percent, but keeps the quality of decisions relatively intact.

The broader point here? Think about stuff like people, how people are spending their time, where people’s talents could best be utilized, etc. Those aren’t necessarily topics that relate directly to the bottom line and shareholder value, but over time they relate immensely to those topics, so some kind of best practices around them do need to be considered.

Brief thought exercise: where’s the line in a job interview between being casual/funny/personable and the definition of professionalism?

Feel like this has happened to me a couple of times in the last six-seven months: I’m at a job interview, and it’s one of those situations where you meet with 3-5 people in a given day, for about 30 minutes to 1 hour each. I actually somewhat like this format — mostly because you get a good idea of the different types of mid-level-and-up personalities at the place, which can be helpful — although by the end you’re tired of talking about yourself. But here’s the situation that happens a lot: 1-2 of these people will come in and instantly try to take it to a more casual place — kinda like two people grabbing a beer or a coffee, and not necessarily being on a job interview. This is more natural for me than a job interview (I’d assume that statement applies to close to 100 percent of the population, since I don’t think anyone goes on job interviews regularly enough to be utterly comfortable in them, or loves them), so I’ll fall into the natural rhythms of that type of discussion. I may even lob a joke here or there. I feel like this makes sense — companies/organizations often say, on the record, that they don’t want robots, but rather want people with actual personality who will benefit the different teams intangibly as well. So I’ll do that — which is to say, be casual but keep it within the navigational beacons of professionalism — and then I won’t get the job. If I get feedback (let’s say 1/5 times), I’ll hear something like, “Well, so-and-so interviewer thought you were a little casual…” Hmmmm.

So if an interviewer goes down the casual road, is the best approach to not respond in kind? He/she goes casual; you stay professional and driven and accomplishments-reeling-off the whole time. Alright … that does seem like the safer plan. But then, if you didn’t get the job and got feedback, would the feedback be about being too robotic, or “not being the right fit?” (That’s a dreaded way to be evaluated, because it means everything and nothing all at once — and anything that means everything and nothing just sends your brain into hyper-drive about what it means. Brutal irony of life right there.)

To summarize: organizations often say they want “real” people — people with senses of humor, the ability to converse in different styles (again, logical; organizations often have many different types of clients/customers that an employee may interact with), and the ability to feel “real” or “organic.” Of course, organizations also want — and most have — professionalism in all aspects. So where’s the line when you attempt to enter an org? Do you play it all the way to one side? Do you try to figure out in five seconds whether the specific interviewer would be good with a more casual approach? Do you just go off the stated culture and hope the stated culture is what the interviewers actually care about?

Any thoughts, leave ‘em in the comments. I actually am curious.

