
A Brief History of Copper

Note: This is a “reprint” of an article I wrote for the now-defunct The Cask.

The metal of metals, copper possesses several traits that have made it one of humanity’s most favored elements, despite lacking the panache of silver, gold, or platinum. For example, copper—red-orange in its purest state—is one of only four elemental metals, along with caesium, gold, and osmium, that aren’t gray or silver. That said, there are many other significant reasons why humans have extracted copper from the earth for thousands of years. It’s malleable, highly recyclable, and an excellent conductor of electricity and heat. In short, copper is an ideal metal whose use has played a pivotal role in civilization’s rise.

Our ancestors realized copper’s benefits a long time ago; the metal’s been pulled from rock and crafted into items for at least 10,000 years. The earliest known example of copper use comes in the form of a pendant, dating to 8,700 B.C., that was unearthed in the area that’s now northern Iraq. Copper also found a home in ancient Egypt (where copper tubing was used to transport water inside the Temple of King Sa’Hu-Re), India (where it was used to make lamps), Zambia (where it was used to craft burial ornaments), and other cultures of antiquity.

Those ancient peoples faced challenges in removing copper from the earth. Initially, they chipped copper from the rock in which it was embedded and hammered it into a larger mass for use in tools and weapons, but they discovered that the metal was easily broken. Fortunately, they soon learned that smelting—the act of using heat to produce metal from its ore—was the superior copper-extraction method. The earliest known evidence of smelting—dating to roughly 5,000 B.C.—was found in Serbia, but the process also independently arose in several areas across the globe, including Central America, China, and West Africa.

Around 3000 B.C., copper was combined with tin to create one of the first super-strong engineering materials: bronze. The continual refinement of copper, both in how it was alloyed with other metals and in how it was extracted, transformed the world by improving tools, construction, and weaponry.

The Middle Ages and the Age of Discovery saw copper applied in new and exciting ways. Copper became a part of artistic expression in Renaissance canvases and sculptures and later in techniques such as daguerreotype photography. Copper also played a large role in international relations and affairs. It lined the hulls of Christopher Columbus’s famous seafaring fleet to prevent the ships from sustaining damage from salt water and biological agents. And the Statue of Liberty, France’s beautiful gift to the fledgling United States, incorporates more than 200,000 pounds of copper (her green tint comes from years of oxidation).

Our favorite use of the metal? The stills at The Glenlivet distillery, which are made of 100 percent copper.

Copper use was important in the past, but it’s even more essential today. The element’s excellent electrical conductivity makes it a cost-efficient, go-to metal for the electronics industry, where it powers items such as batteries, microwaves, motors, smartphones, and tablets. The Copper Development Association estimates that 65 percent of all unearthed copper is purchased by the electronics industry.

One of the most significant moments in copper history occurred in 1997, when technology giant IBM adopted copper interconnects in its chips, replacing the aluminum standard. The result was faster, smaller, and thinner computers and gadgetry, thanks to copper’s malleability and its ability to conduct electricity with roughly 40 percent less resistance than aluminum.
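That resistance claim is easy to sanity-check with the textbook formula R = ρL/A. The quick sketch below uses standard room-temperature resistivity figures (my own assumed values, not numbers from this article) and lands close to the cited 40 percent:

```python
# Back-of-the-envelope: resistance of identical wires in copper vs. aluminum.
# Resistivity values (ohm-meters, ~20 C) are common textbook figures.
RHO_COPPER = 1.68e-8
RHO_ALUMINUM = 2.65e-8

def wire_resistance(resistivity, length_m, area_m2):
    """R = rho * L / A for a uniform conductor."""
    return resistivity * length_m / area_m2

length = 1.0   # 1 meter of wire
area = 1e-6    # 1 square-millimeter cross-section

r_cu = wire_resistance(RHO_COPPER, length, area)
r_al = wire_resistance(RHO_ALUMINUM, length, area)

print(f"copper:   {r_cu:.4f} ohms")
print(f"aluminum: {r_al:.4f} ohms")
print(f"copper has {100 * (1 - r_cu / r_al):.0f}% less resistance")
```

The result (about 37 percent) depends on the resistivity values chosen; published figures for both metals vary slightly with purity and temperature.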

The construction industry has a great need for copper, too. Copper’s bacteriostatic properties make it an excellent material for water and heating systems as it prevents bacteria from reproducing. Yes, even in the age of the ultrasleek high-rise condo, simple copper is a crucial component.

Unfortunately, contemporary copper use has raised concerns about the metal’s remaining supply. The earth contains a vast amount of copper—some estimates place the total at 10 trillion tons—but limitations of excavation and mining technology, as well as the economics of unearthing the metal, cause some to worry about the concept of “peak copper,” a hypothetical time when we’ve reached maximum copper production and have to cope with a dwindling amount.

Peak copper critics argue that the 9 million tons of copper that are recycled per year should greatly reduce the fear of a diminishing supply; copper loses no quality during the recycling process, so it can be used over and over again. In fact, recycling copper consumes 85 percent less energy than pulling the metal from the earth, so it has a relatively small environmental impact. According to the International Copper Association, an incredible 75 percent of the copper produced since 1900 is still in use.

Still, both sides agree that developing nations such as China, Brazil, and India have dramatically increased copper demand and prices. According to Oracle Mining Corp., copper traded for less than $1 per pound in 2002; it traded for nearly $4 per pound in 2014.

Copper’s high demand—one that’s certain to increase in coming years—solidifies the metal’s status as mankind’s favorite shiny element. Copper has been an essential part of our lives since the rise of civilization, as it helped us craft the tools that make life simpler, more beautiful, and at times, sadly, deadlier. So the next time a penny finds its way into your hand, take a moment to think about the greatness within it, a greatness that’s often a reflection of our time on this planet.

Image courtesy of Fort Myers Florida Weekly.

Arrested Development

“But I am still thirsty”

Frustration is the shriveled hand that quickly lowers our life-blinds and prevents us from enjoying the vistas. The feeling can touch any area of our existence, but it has a particularly wounding sting when it grips our careers. Although I constantly trumpet the idea of divorcing one’s self-worth from one’s job, I recognize that it can be a wickedly difficult course of action; we pour a significant number of hours into our jobs, after all. I also recognize that I’m not immune to the struggle.

I entered the publishing business at age 30, a time in life when most writers and editors are already ascending the ladder of success. Not only was I new to the game, I stood on the absolute bottom rung of the ladder—I was an intern. As a result, I reached, stretched, leaped, and scrambled to climb to a senior-level position and overcome that delayed start. The journey took longer than I’d imagined, and I sacrificed tears, time, sanity, and a relationship to get there. Some would question whether the journey was worth so much strife, and that would be a fair critique. I performed actions that I’d never repeat or encourage others to take, but the many trials proved beneficial in the long run for one simple reason: I learned not to take any of this too seriously.

Still, there are moments when it’s difficult not to feel as though my career would be on another level if I’d pursued editorial during the normal window in a young professional’s life. The thoughts often creep to the forefront during meetings with people above my rank, but I try to drown out regret’s footsteps with happy reflections.

I grew up a poor kid raised by a single mother. She did a fine job of shielding me from just how poor we were and put me on the path of learning and dreaming. In my younger days, I wrote before I consciously decided to become a professional writer. I slapped together awful poems. I created and penned an Uncanny X-Men parody that I sold to other 7th graders during lunchtime. I wrote a couple of hacky screenplays that, thankfully, remain on a USB drive for none to see. The projects were mainly a way for me to escape some of the stress that came with growing up in Coney Island during the crime-filled 1980s and ’90s.

I was never a troublemaker, but I got into the typical troubles that teens get into in a large, connected metropolis. I grew up in the projects, and ran with peddlers, but the only time steel touched my wrists was when I got caught hopping a train turnstile on West 8th St. I had goals, and knew that getting involved in negative activities would ruin my pursuit of attending CES, covering E3, visiting Japan, and sitting on a panel at a geek-related function to talk nerd stuff.

All of those dreams, and more, eventually became reality. This isn’t a self-high five moment. I simply highlight these accomplishments because those memories comfort me whenever I begin to kick myself. They also inspire me to chase more.

It would be easy to coast on those successes, but now is the time to set new goals, ones that will fuel the second half of my existence. They represent not just my future, but my personal shift from career goals to life goals.

  • Continue expanding my economic ability to walk away from anything (AKA, “Fuck You Money”)
  • Help even more black people achieve their goals
  • Create a successful podcast
  • Travel more
  • Make art

They’re a mix of passion, financial, and “leaving a legacy” projects. And the best part about the plans? I’m starting them at exactly the right time in my life.

Image courtesy of Chrysalis/EMI Records.

How to be an Expert

Tweaking my LinkedIn profile upped my expert status

When the word “expert” hits your ears, what images come to mind? A wizened man with salt-and-pepper hair in a tweed jacket? A scholarly woman sporting a bun and librarian glasses? How about a guy in a cheap sweater, Lucky jeans, and low-top suede Wallabees? If you didn’t imagine the last person, I don’t blame you; I wouldn’t have envisioned him either—and that guy is me.

Recently, I did my yearly LinkedIn profile update, an annual task that I adopted after reading Jill Duffy’s “Get Organized: 5 Tips for Getting the Most from LinkedIn.” Her suggestions helped me tighten and strengthen my LinkedIn page, but one tip that I didn’t apply until recently proved especially valuable: think in keywords. Long story short, I tweaked my professional title into one that’s more SEO-friendly, so that it would catch the eye of people searching LinkedIn for, say, “tech editor.” And it’s worked!

High-profile news publications, freelance writers, college kids writing theses, and podcast hosts have asked me to drop knowledge in the last few weeks. It’s been an empowering experience. Although I’ve written about technology for a decade, I saw myself as an editor with valuable thoughts and analysis, but not necessarily an “expert.” That is, until someone on the other end of the phone actually referred to me as such.

It legitimately surprised me. No false humility here. It later dawned on me that because so many of my friends and acquaintances work in the technology and video game fields, I’d lost touch with the fact that not everyone knows—or cares to know—the PlayStation’s role in elevating video games into a mainstream, billion-dollar industry. Or the best Web hosting services for companies on a budget. That realization changed how I view myself in terms of career goals, and it helped me identify the steps needed to walk toward expertise.

  • You must have a high level of knowledge in a particular area
  • You must have a few years under your belt; people seek veterans for knowledge
  • You must have the ability to explain a topic in everyday language to someone who’s unfamiliar with it

And that’s about it. I think. There’s a very good chance that I may have overlooked an essential tip, but I never claimed to be an expert about experts.

Image courtesy of ReliableSoft.