I agree with you and the article and further...
Many companies are trying to take search, for example, to the next level by bringing predictability through your history as well as your graph. It takes decision-making out of people's minds.
"You show a history of reading this and that book, have watched these movies and your circle of meaningful people have influenced you thus, so we have this book on its way to you now."
It looks like it's really intelligent and offers convenience, time savings, "efficiency". You don't even have to think of what you want; it knows already. You have an appointment in LA in 3 hours, don't forget your [whatever], and the mother of the person you are about to meet is in the hospital, so take something appropriate for the occasion; we also suggest the following lamentation: "...".
This kind of thing is taking the humanity out of being human. People are reduced to a basic animal framework and the machine is adding humanity for you, so you don't have to bother. In taking the "tedium" out of daily living, people are reduced to a sort of beast on a pedestal.
You can look at it as removing our humanity, or you can look at it as supplementing our humanity.
Prior to the written word, we had to devote time and effort to passing down information through the spoken word. By offloading this task to an external device (wall, tablet, parchment, paper, book, screen), we've been able to use that time and brainpower to focus on other things, while also increasing the fidelity of our history and what we can learn from it. People lament the loss of an oral storytelling culture. It is a loss, but we've gained as well: mathematics, literature, philosophy. We are able to build on the past in a way we weren't before.
Prior to the plow and the domesticated animals to pull it, we had to plant manually or practice a more hunter-gatherer lifestyle. By offloading these tasks we are able to secure our future needs, and live a life less of subsistence and more of security in our health and future. Some people lament the loss of connection to the land, the feeling of oneness with the seasons. It is a loss, but again, we've gained as well. This free time has allowed us to study what we've written, and build upon the past.
You could make a case that using these tools (and hundreds more in our history) takes the humanity out of you, and relies on the tool for it. I would make the case that using these tools is what makes you human, as without them, what's really the distinguishing feature that separates you from the apes? To me, nothing defines humanity more than the constant improvement of our race, our reach, and our capabilities.
That I carry around a small computer in my pocket that connects to a much larger network and allows me near instantaneous connection to my family, friends, coworkers, and the largest accumulation of knowledge the human race has ever seen might seem like I'm trading away what you think makes us human, but since I use this to keep in touch with friends, be more mindful of important things to them, learn about what I'm doing to be more safe, secure and healthy, and share with myriad sub-cultures, I strongly disagree.
I'm aware of all that. We have come a long way since being "savage". My issue is that we're getting to the last-mile problem, where we begin to undermine what we are.
My point is that I don't think "what we are" is changed by the tools we use. So what if a tool suggests we take time out for our friend? It's still our choice. It's no different than your mother, father, significant other, or random person on the street offering a suggestion if you explained the situation to them. Ultimately it's your choice what to do.
If you're unhappy with how people treat some things as worthy of their attention and others as something they can delegate to others, that's not a technology problem, that's a problem you have with cultural norms. Culture is the original hijacker of our minds, and all we're seeing is people acting in a way that's slightly different from what you think is acceptable based on your culture, and fighting back against it. Not because it's inherently wrong, or worse, but because it's different, and culture resists large change.
>> My point is that I don't think "what we are" is changed by the tools we use
No way. Just look at language, a tool we use for communication. It changes us radically. Listen to the Radiolab episode on words: http://www.radiolab.org/story/91725-words/.
Another example is writing. Before widespread writing and printing, many people used the 'Mind Palace' to remember things (https://en.wikipedia.org/wiki/Method_of_loci), hijacking the place cells in the hippocampus to vastly increase memory. With writing, that all evaporated, much to the chagrin of the old monks who taught and used the technique. But then look at us now as a species.
Another example is money. The tool that is currency and a method of exchange has changed some people a lot. Greed is not a good thing. Most religions have bans on usury and interest because of the warping effects it has on society. I may or may not agree with those bans; I mean, hey, I live better than any king from 200 years ago. Still, we are feeling those effects today in our politics after Citizens United. Money, the good tool that it is, does change people.
I'm sure others can think of many more examples. But your tools change you just as much as you use tools to change the world.
"What we are" is meant in the context of "To me, nothing defines humanity more than the constant improvement of our race, our reach, and our capabilities," which I outlined in my original comment. In that respect, the tools are just stepping stones to let us be us. Sure, we might have less specialized cells in the hippocampus because we aren't exercising a specific (conceptual) tool, but I don't see that as any different from peasants with muscles developed to make it easier to spend the day tilling the land, or a knight with muscles developed for fighting in armor, or a modern-day baseball player, with eyes developed to track fast-moving objects and lots of high-speed twitch muscle fiber.
These tools change our jobs and behavior, but they don't change our core physical being much in a way that persists (evolutionary aspects of course apply), much less change philosophically what makes us "human" (in the sense of our "humanity", not homo sapiens).
Another way to look at this, what is it to reduce someone's humanity?
Listen to that Radiolab broadcast. I'm pretty sure that's the one that goes into detail about a man who grew up deaf (in El Salvador, I think) and without any access to sign language. The broadcast features a woman who went there as a grad student and, I think, ended up teaching the guy sign language. A translator for the guy describes when he finally 'got' language, and from what I remember, it was life- and soul-changing for him. He mentions that his friends who were also without the tool of language would try to communicate through charades and it would take hours to 'talk'. I can't do this justice, but the tale is very powerful. I also can't verify that the linked broadcast is about what I just said, fyi (in a coffee shop, no headphones, sorry).
Still, I disagree. The tools that we use shape our souls as much as our brains. It's that repetitive daily use that mostly does this. We come to see the world and then believe things in the way we experience it, and our tools heavily shape this view.
To reduce someone's humanity is an incredibly deep question and I don't have the space here, nor the access to beer, to do it justice. However, I like that you turned the question on its head. Most of the time we ask what it would take to make something human. The way you pose this question, "what would it take to rob a person of their humanity?", is MUCH more interesting and very provocative.
I'll definitely do so. I don't recall off-hand if I've heard that one, but I've heard a lot of them (I've been an on-and-off listener for a decade).
> Still, I disagree. The tools that we use shape our souls as much as our brains. It's that repetitive daily use that mostly does this. We come to see the world and then believe things in the way we experience it, and our tools heavily shape this view.
I don't disagree with your assertion, I just don't think the way we "see the world and then believe things in the way we experience it" is actually humanity.
> To reduce someone's humanity is an incredibly deep question and i don't have the space here, nor the access to beer, to do it justice.
It is, but my use here is fairly simplistic. Basically, I think most people use "humanity" as a stand-in for the values of the day. Obviously, if something is so affected by the current culture and attitude, it can't be the defining trait of our race, can it? Thus, most of my comments here have been along the lines of both calling out what I see as a bogus definition and proposing my own, based on what I see as the defining traits of our race: innovation, advancement, etc.
> "what would it take to rob a person of their humanity?"
It really is a big question, but this bit by Hannah Arendt in "The Origins of Totalitarianism" (a book I highly recommend; it's as timely as ever, sadly) really struck me. It's on page 667 in the German version; this is my crappy translation:
> Humans, in so far as they are more than a completion of functions able to react, whose lowest and therefore most central are the purely animal like reactions, are simply superfluous for totalitarian systems. Their goal is not to erect a despotic regime over humans, but a system by which humans are made superfluous. Total power can only be achieved and guaranteed when nothing else matters except the absolutely controllable willingness to react, marionettes robbed of all spontaneity. Humans, precisely because they are so powerful, can only be completely controlled when they have become examples of the animal like species human.
So I would say at least part of the answer might be: taking away the ability to act instead of just react, and the ability to start a logical chain of thinking from new premises (which is also something she mentions, though of course in contrast with totalitarianism, which forces a certain flow of logic based on some premises set in stone; she's not writing about what it means to be human).
Great question, and I won't do it justice in this attempt. I think it's at the point where people are more or less extricated from decision-making, however small those decisions may be. When people are essentially just organs plugged into a network, like cows at a milking station.
All you have to do is be; you no longer have to think. Slowly, things are done for you and decided for you -- with the best intentions, of course.
See, I don't think we are. Some decisions are being automated, but that's not reducing our own choices, just freeing us to focus on other ones. It becomes less of "should I go and then what flower should I buy" and more just "should I go" and we've freed our time and attention for some other decision. Then again, there will always be those that go above and beyond for someone they care for. Maybe you remember that your mother particularly liked the flowers at the house you lived in when you were younger, so you track down the current owners and ask for a favor, whether you can have a few flowers from the front yard. Letting your mother know of the source might yield a distinctly different impression from the present. There's still room to provide attention where it matters, apps and reminders aren't taking that away.
If you look beyond reminder apps, I think things look a little different. There are good aspects to automation and some shortcomings as well.
I'm not saying people need to maintain survivalist skills. But I think there is a danger in relinquishing your will and control to data. How will people, in the future, be able to adjust to catastrophe? I have no pocket computer, what should I do? Can I eat now? What can I eat? It's so bewildering...
It's akin to the child of overly protective parents who one day has to face the world on their own; it can take years to recalibrate and readjust to the new reality.
I think that ignores that there is always a spectrum of people who want differing levels of control. Even if there are apps to suggest all our actions, there will be people who use them sparingly or not at all, because they want to feel in control. This is acceptable and normal, and also leads to situations where some people do maintain survivalist skills. Our population is homogeneous in almost nothing, which is one of our strengths.
What happens in a catastrophe? People probably die. Depending on the scale, possibly a lot. Will everyone die? Probably not, but there are cases where it could happen. The ways to mitigate that have nothing to do with less automation of simple decisions in my mind, and possibly quite the opposite.
Well, like I said earlier, it's not about automating simple tasks and giving us reminders, etc.; rather, ultimately we'll take people out of the equation in most matters of import.
We'll be relegated to a state where there are a few "important people" and the rest are basically just vegged out (with a few pockets of 'natural people' here and there). People will not even notice this happen, as they will slide into this state willingly and happily. Just as we slid into a sugary diet without complaint.
It's hard to make the case for "inconvenience" in life: it's harder, it takes more energy, it's not efficient, and so on. But I think if we are to remain a useful species (not just a few useful people), we have to act contrary to our inclination.
If everyone has apps that remind them to take time for a friend automatically (because the app determined they needed it), the intention loses some of its meaning. That friend won't know whether it was you who remembered because you cared and took notice, or simply that the app reminded you. It's about the motivation behind your actions; it's what can give a deeper meaning to them. The fact that you or another friend cared or noticed enough and remembered gives it significance.
With too much automation of our social lives and relationships, many actions may simply become meaningless. Maybe they would even be done away with after a while, making human interaction colder, less personal. My app will talk with your app and set up a time to do something we enjoy, without either of us thinking of it ourselves, or discussing it. We would just have to show up, as directed by our apps. These apps remind us what we should discuss or mention, given our friend's likes/dislikes. Sure, we might have a great time, but the app might have prioritised meeting this friend over another, because you're more compatible (measured via some series of metrics). You rely on the algorithm. You're not actually thinking about who you want to spend time with and why. Would you be able to make friends without the app?
That's life. As in, that's life as it has always been. There have always been people who remind others, and there are always those who remember things more easily than others. What difference is there whether it's an application or your mother reminding you? You still need to make a choice whether it's worth it to you; the app isn't forcing you to do anything. It can alter the influence we apply to certain actions when deciding how much someone cares, but it doesn't take away our ability to show we care. So what if more people remember birthdays or illnesses? The people who really care will make an effort to show that in some manner. If just remembering is no longer a big deal, they'll do something else.
> With too much automation of our social lives and relationships, many actions may simply become meaningless.
So we stop doing those actions. We'll find new actions, or reinterpret old ones to mean something more.
> You're not actually thinking about who you want to spend time with and why.
Then you must not care about the people involved. You devote time to what you care about. Reminders aren't going to change that.
These reminders are really just alleviating cultural busywork, which is only needed because these cultural norms developed when we lived in much smaller communities. Remembering birthdays and meeting in celebration with all your friends was much easier when we lived in villages of 100 people. Now we have cultural baggage from different stages of economic development, and it creates a lot of extra work just to do what's culturally expected, because it doesn't fit the environment so many of us find ourselves in. In the end, that's just busywork; we'll find ways to let those who matter to us know it.
So you don't think "what we are" is changed by the tools we use but it is changed by changes in cultural norms? Which are clearly influenced by the tools we use?
No, I think neither the tools we use nor our cultural norms really define what we are, just what we do, philosophically speaking. I think we're a race of culture-creating tool users who strive to advance. The tools we currently use and the cultures we find ourselves creating are at most stages in a progression, and possibly not even that, but just fleeting memes in the story of our species (if we don't dead-end ourselves).
Ah! That makes sense. I didn't see you were making a what-we-are/what-we-do distinction. Philosophically, I think that distinction is faulty, as is attributing a direction to advancement/progress/evolution, so I had trouble parsing what you meant.
Whilst I somewhat agree with the sentiment, the difference now is in the amount of efficiency or benefit gained. The gains from tools, farming, writing, etc. are quite large. The gains from an app suggesting a movie or book, or reminding you to do various things in your life, are quite small in comparison. It might be merely a matter of a few minutes.
It raises the question: is it worth gaining a few minutes here and there each day at the price of dependence on apps and a degree of disconnection from what makes us human?
That assumes the suggestion is directly comparable to an action you would have chosen yourself. What about suggestions for things where you would have taken no action, or suggestions to not take an action you would have taken? What about suggestions that include information you would not have been likely to come across yourself?
Imagine your friend is in the hospital, but you've been busy, and haven't checked your normal sources of information about your friends for a few days. You get a suggestion to send them best wishes or flowers, because they are in the hospital. This both suggests a course of action you wouldn't have considered, because you didn't know it was a candidate as well as imparts useful high priority information that you want to know that was hidden behind a feed you classified as low priority lately.
Sure, many suggestions may be minimal and not help much, but do we always classify systems based only on their average use? Sometimes it's important to look at the distribution of use, and whether there are other positive (or negative) externalities.
Currently, if you didn't know about a circumstance, people forgive you and generally understand. But if you took the time to find out this information personally, that would have meaning, showing that you cared enough to spend the time to check. When you send flowers, the act can often have much more meaning. However, if everyone has apps that remind them of such things, you would no longer appear thoughtful; there is no longer as much meaning. We are already seeing this to a certain extent with social media.
This is the point. Is it worth the small gain in time, for the loss of some aspects of meaningful human connection/interaction?
"Meaning" always exists relative to a background. If the average person doesn't remember but you do, you stand out, and your actions have extra meaning. Just because the background decreases doesn't mean you can't stand out. You can always go beyond what the app suggests; for example, in addition to sending flowers you could appear in person and sing to them.
Except it's dubious whether that device meaningfully connects you to others, or distracts you from making real personal connections with the people around you through constant distractions and intermittent rewards.
This is exactly the problem. Supplementing x implies complete knowledge of what the ideal x is. And the "ideal" is nearly impossible to assess, since it requires complete knowledge of the future. I mean, what form of "human being" will be the most optimal through all of human existence? When you supplement without knowing the ideal, it has more chance to do harm than good.
>using these tools is what makes you human, as without them, what's really the distinguishing feature that separates you from the apes?..
Ok. But is it the only thing? What about creativity? We have language. We can create great stories and beautiful poems in it. We can look into the secrets of nature and create immensely powerful tools with that knowledge. Isn't that more of a hallmark of being human than mere dependency on our tools?
And is there no limit on how dependent on them we become? Does it make sense to trade off our innate capabilities in the long term for minor conveniences in the short?
I had a friend who could easily navigate any complex route and had all the local routes and shortcuts in his head. I respected him for that. Now he cannot find his way around a supermarket without GPS. A part of him that I once respected as a human being is gone now.
>To me, nothing defines humanity more than the constant improvement of our race, our reach, and our capabilities.
You think our race is improving? Why? Because we have smartphones and a massive, collective addiction to them?
>and the largest accumulation of knowledge the human race has ever seen..
It is also one of the biggest Ad/propaganda delivery channels.
Easy access to information does not make a difference if people do not have the drive to consume it. If you give the internet to 10 people, 9.5 of them will use it for social media and porn. How many of the addictive smartphone users have you seen consuming Wikipedia? I have personally NEVER, not even once, seen someone reading Wikipedia on a smartphone.
> by bringing predictability through your history as well as your graph
I wish they were better at it. For 15 years now, people have promised that relevant ads will be useful rather than annoying. So far, it seems like the pinnacle of that is to show me the last pair of shoes I looked at on Zappos everywhere I go.
Ad-tech seems laughably bad. Now I just run with uBlock Origin. Maybe I'll block ads for the next 15 years and see how things are looking in 2030.
Not necessarily Google, but yes some amalgamation of technology.
Your senses are staggeringly limited and your brain is a mess when it comes to determining what is "real" or not.
We use technology to fill the gaps. That's what technology is for: to allow us to "see", interact with, and influence the world through discrete and explicit measurement in ways our biological systems can't.
How is that not the dream? Augmenting and eventually replacing our biological capabilities with robust, high precision systems is the ultimate goal.
> Your senses are staggeringly limited ... when it comes to determining what is "real" or not. We use technology to fill the gaps. That's what technology is for - to allow us "see", interact with and influence the world through discrete and explicit measurement in ways our biological systems can't.
Sense perception to understand reality, or to seek pleasure? When we pick up the phone to play a game, is that interacting with reality or with pure imagination? The same with the roundabout actions of liking each other's posts or seeing what new experience you posted, etc.
Sense perception can be used for survival and for pleasure. In the game of evolution the latter will get phased out and the former will evolve. However, in the short term, where there is a great economic incentive to game the "pleasure-seeking senses" to create wants and consumption, there will be short-term pain and confusion in the masses. The question to ask ourselves is whether to succumb to this enticement or just move on and let it meet its eventual end.
Countries and companies are higher-order concepts that last over generations. This lets them experiment over many people's lives and adjust accordingly. But as individuals we have limited time and space. It's only our mind that can help us separate the wheat from the chaff and live a meaningful life.
> Augmenting and eventually replacing our biological capabilities with robust, high precision systems is the ultimate goal.
This was also true when the wheel was invented. The purpose of technology is to help in getting things done. Pleasure-seeking sensory perception is just a distraction. And for many of us, the ultimate goal is to live a better life by reducing these distractions.
Your point jumps over my broader argument to address the original thrust of the article, however. In that context we aren't in disagreement on this point.
I was, however, addressing the question of "should technology guide our existence?", and I think you agree with my emphatic answer of yes but don't realize it.
> And for many of us, the ultimate goal is to live a better life by reducing these distractions.
My guess is you are compartmentalizing things like social networks etc. into those "distractions." I might also, but I would also place in there "hunger", "pain", "confusion", etc.: the things that distract us from understanding the fundamental nature of the universe.
Our goals likely align - but on different points along the scale of time.