Apparently every growing part of every plant is continually circumnutating, though often on a small scale. Even the stems of seedlings before they have broken through the ground, as well as their buried radicles, circumnutate, as far as the pressure of the surrounding earth permits. In this universally present movement we have the basis or groundwork for the acquirement, according to the requirements of the plant, of the most diversified movements. Thus, the great sweeps made by the stems of twining plants, and by the tendrils of other climbers, result from a mere increase in the amplitude of the ordinary movement of circumnutation. The position which young leaves and other organs ultimately assume is acquired by the circumnutating movement being increased in some one direction. The leaves of various plants are said to sleep at night, and it will be seen that their blades then assume a vertical position through modified circumnutation, in order to protect their upper surfaces from being chilled through radiation. The movements of various organs to the light, which are so general throughout the vegetable kingdom, and occasionally from the light, or transversely with respect to it, are all modified forms of circumnutation; as again are the equally prevalent movements of stems, etc., towards the zenith, and of roots towards the centre of the earth. In accordance with these conclusions, a considerable difficulty in the way of evolution is in part removed, for it might have been asked, how did all these diversified movements for the most different purposes first arise? As the case stands, we know that there is always movement in progress, and its amplitude, or direction, or both, have only to be modified for the good of the plant in relation with internal or external stimuli.
Are plants intelligent beings? Starting from this simple question, Stefano Mancuso and Alessandra Viola lead the reader on an unusual and fascinating journey through the plant world. Broadly speaking, plants could live perfectly well without us; we, on the other hand, would go extinct in short order without them. And yet even in our language, as in almost all others, expressions like “to vegetate” or “to be a vegetable” have come to denote life reduced to its bare minimum. “Who are you calling a vegetable?”… If plants could speak, this might be one of the first questions they would ask us.
Pretty much anywhere in the world men tend to think that they are much smarter than women. Yet arrogance and overconfidence are inversely related to leadership talent — the ability to build and maintain high-performing teams, and to inspire followers to set aside their selfish agendas in order to work for the common interest of the group. Indeed, whether in sports, politics or business, the best leaders are usually humble — and whether through nature or nurture, humility is a much more common feature in women than men. For example, women outperform men on emotional intelligence, which is a strong driver of modest behaviors. Furthermore, a quantitative review of gender differences in personality involving more than 23,000 participants in 26 cultures indicated that women are more sensitive, considerate, and humble than men, which is arguably one of the least counter-intuitive findings in the social sciences. An even clearer picture emerges when one examines the dark side of personality: for instance, our normative data, which includes thousands of managers from across all industry sectors and 40 countries, shows that men are consistently more arrogant, manipulative and risk-prone than women.
The paradoxical implication is that the same psychological characteristics that enable male managers to rise to the top of the corporate or political ladder are actually responsible for their downfall. In other words, what it takes to get the job is not just different from, but also the reverse of, what it takes to do the job well. As a result, too many incompetent people are promoted to management jobs, and promoted over more competent people.
- Insist on doing everything through “channels.” Never permit short-cuts to be taken in order to expedite decisions.
- Make “speeches.” Talk as frequently as possible and at great length. Illustrate your “points” by long anecdotes and accounts of personal experiences.
- When possible, refer all matters to committees, for “further study and consideration.” Attempt to make the committee as large as possible — never less than five.
- Bring up irrelevant issues as frequently as possible.
- Haggle over precise wordings of communications, minutes, resolutions.
- Refer back to matters decided upon at the last meeting and attempt to re-open the question of the advisability of that decision.
- Be “reasonable” and urge your fellow-conferees to be “reasonable” and avoid haste which might result in embarrassments or difficulties later on.
- In making work assignments, always sign out the unimportant jobs first. See that important jobs are assigned to inefficient workers.
- Insist on perfect work in relatively unimportant products; send back for refinishing those which have the least flaw.
- To lower morale and with it, production, be pleasant to inefficient workers; give them undeserved promotions.
- Hold conferences when there is more critical work to be done.
- Multiply the procedures and clearances involved in issuing instructions, pay checks, and so on. See that three people have to approve everything where one would do.
- Work slowly.
- Contrive as many interruptions to your work as you can.
- Do your work poorly and blame it on bad tools, machinery, or equipment. Complain that these things are preventing you from doing your job right.
- Never pass on your skill and experience to a new or less skillful worker.
The fundamental cause of the trouble is that in the modern world the stupid are cocksure while the intelligent are full of doubt.
The more we learn about the world, and the deeper our learning, the more conscious, clear, and well-defined will be our knowledge of what we do not know, our knowledge of our ignorance. The main source of our ignorance lies in the fact that our knowledge can only be finite, while our ignorance must necessarily be infinite.
Meditation is the dissolution of thought into eternal awareness or pure consciousness, without objectification; knowing without thinking; the merging of the finite into the infinite.
- Too many managers want to learn “how” in terms of detailed practices and techniques, rather than “why” in terms of philosophy and general guidance for action.
- Learning is best done by trying a lot of things, learning from what works and what does not, thinking about what was learned, and trying again.
- Without taking some action, learning is more difficult and less efficient because it is not grounded in real experience.
- In building a culture of action, one of the most critical elements is what happens when things go wrong. Even well planned actions can go wrong.
- Fear in organizations causes all kinds of problems. People will not try something new if the reward is likely to be a career disaster.
The authors found research from the field of psychology that conceptualises the types of knowledge or memory needed:
- declarative knowledge, ie explicit knowledge, knowledge you can state
- procedural knowledge, ie tacit knowledge: you know how to do something but cannot readily articulate this knowledge
Another classification identified by the authors is organisational knowledge:
- formal codified knowledge, such as data and written procedures
- informal knowledge, such as that embedded in systems and procedures
- tacit knowledge arising from the capabilities of people
- cultural knowledge relating to customs, values and relationships
To maintain and strengthen the country’s manufacturing base and to ensure its remaining competitive surely deserves high priority. But this means accepting that manual labour in making and moving things is rapidly becoming a liability rather than an asset. Knowledge has become the key economic resource and the dominant – and perhaps even the only – source of competitive advantage. Creating traditional manufacturing jobs – as the Americans, the British and the Europeans are doing – is, at best, a short-lived expedient. It may actually make things worse. The only long-term policy which promises success is for developed countries to convert manufacturing from being labor-based into being knowledge-based.
Equally important is a related insight of the last few years: knowledge workers and service workers learn most when they teach. The best way to improve a star salesperson’s productivity is to ask her to present “the secret of my success” at the company sales convention. The best way for the surgeon to improve his performance is to give a talk about it at the county medical society. We often hear it said that in the information age, every enterprise has to become a learning institution. It must become a teaching institution as well.
Moving beyond the rather limited and abstract concepts of competency and underpinning knowledge into a more content-rich zone, we can actually identify five distinct types of crime prevention knowledge:
(1) Know-about — knowledge about crime problems and their costs and wider consequences for victims and society, offenders’ modus operandi, legal definitions of offences, patterns and trends in criminality, empirical risk and protective factors and theories of causation.
(2) Know-what — knowledge of which causes of crime are manipulable — what preventive methods work, against what crime problem, in what context, by what intervention mechanism/s, with what side-effects and what cost-effectiveness, for whose cost and benefit.
(3) Know-how — knowledge and skills (competencies) of implementation and other practical processes, operation of equipment, extent and limits of legal powers, instruments and duties to intervene, research, measurement and evaluation methodologies.
(4) Know-who — knowledge of contacts for ideas, advice, potential collaborators and partners, service providers, suppliers of funds, equipment and other specific resources, and wider support.
(5) Know-why — knowledge of the symbolic, emotional, ethical, cultural and value-laden meanings of crime and preventive action.
Doing practical, operational crime prevention involves gaining, and applying, all five Ks. But know-how, and in particular its process aspect, brings it all together.
As for me, all I know is that I know nothing.
Britain seems in a fairly odd place psychologically right now. The denial and shock are subsiding but many of my friends and colleagues are still angry. They’re frustrated by a political system that they feel allowed disingenuous appeals to populist fears. Some are angry at the electorate itself, particularly those who voted to leave and now say they wish they hadn’t, because they didn’t think their vote mattered or they didn’t understand the consequences. For some of us, the existence of #regrexit hurts the most. We honestly thought we were smarter than that.
Despite all the breakthroughs made in science over the last centuries, there are still lots of deep mysteries waiting out there for us to solve. Things we don’t know. The knowledge of what we are ignorant of seems to expand faster than our catalogue of breakthroughs. The known unknowns outstrip the known knowns. And it is those unknowns that drive science. A scientist is more interested in the things he or she can’t understand than in telling all the stories we already know how to narrate. Science is a living, breathing subject because of all those questions we can’t answer.
Intelligence and the capacity of the intellect are two entirely different things.
The intellect is the capacity to discern, to reason, to imagine, to create illusions, to think clearly and also to think in a non-objective, personal way. The intellect is generally considered to be distinct from emotion, but we use the word intellect to denote the whole of the human capacity for thought. Thought is the response of memory accumulated through various experiences, real or imagined, which are stored in the brain in the form of knowledge. The capacity of the intellect, then, is to think. Thought is limited under all circumstances, and when the intellect governs our activities, in the outer world as in the inner, our actions are inevitably partial and incomplete; hence regret, anxiety and suffering.
Intelligence is the capacity to perceive the whole. It cannot separate feelings, emotions and the intellect from one another; for intelligence they are a single movement. Because its perception is always global, it cannot separate man from man, or set man against nature. Since intelligence is by its very nature the whole, it is incapable of killing… If not killing is a concept, an ideal, it is not intelligence. When intelligence is active in our daily life, it will tell us when to cooperate and when not to. The very nature of intelligence is sensitivity, and that sensitivity is love.
Now, one of the problems in those days was that there wasn’t a computer sciences curriculum, or an emphasis in computer sciences like there is today. So when we would recruit, we might get a mathematics major or an economics major or a music major, and we would then bring them into training programs all summer long, intensively teaching them computers and programming and so on. We would really inculcate in them not only the ideas of computing and so forth, but the ideals that Cincom stood for, with the idea that we would be doing for them what IBM did for me — which is make a large scale investment in them, teach them, train them, develop them in the computer field. We hoped that many of those people would believe in our company over a long period of time so that as they grew out of their twenties we would have an active, energetic, well-trained group of people who had eight, ten, or twelve years’ experience with Cincom while still in their early thirties. This worked out very, very well for us. So we rapidly went from an environment of hiring experienced, trained, seasoned vets to investing in good people for the future. We had a formal education training program that was really terrific, probably the best training program ever developed in the software industry, and today may still be the best.
As a practice, software development is far more creative than algorithmic.
The developer stands before her source code editor in the same way the author confronts the blank page. There’s an idea for what is to be created, and the (daunting) knowledge that there are a billion possible ways to go about it. To proceed, each relies on one part training to three parts creative intuition. They may also share a healthy impatience for the ways things “have always been done” and a generative desire to break conventions. When the module is finished or the pages complete, their quality is judged against many of the same standards: elegance, concision, cohesion; the discovery of symmetries where none were seen to exist. Yes, even beauty.
These conflicts are extreme because so many cherished notions about our origins have been overturned by the Out of Africa theory. Our book will show why its basic tenets are correct, nevertheless, and will demonstrate that humankind’s common and recent ancestry has great importance, for it implies that all human beings must be very closely related to each other (as is also demonstrated by genetic studies). Human differences are mostly superficial, changes which took place in the blinking of an eye in terms of our whole evolutionary history. We may look dissimilar, but we should not be deceived by the stout build of the Eskimo, or the lanky physique of many Africans. What unites us is far more significant than what divides us. Our variable forms mask an essential truth–that under our skins, we are all Africans, the metaphorical sons and daughters of the man from Kibish.
Science is a very human form of knowledge. We are always at the brink of the known, we always feel forward for what is to be hoped. Every judgement in science stands on the edge of error, and is personal. Science is a tribute to what we can know, although we are fallible.
Societies dominated by print media regard only printed knowledge as essentially valid. Textbook publishers exert a huge influence on education at all levels while schools and universities refuse to accept knowledge in other than printed forms. The monopoly of knowledge protects its own with wary vigilance.
Much has been written on the developments leading to writing and on its significance to the history of civilization, but in the main studies have been restricted to narrow fields or to broad generalizations. Becker has stated that the art of writing provided man with a transpersonal memory. Men were given an artificially extended and verifiable memory of objects and events not present to sight or recollection. Individuals applied their minds to symbols rather than things and went beyond the world of concrete experience into the world of conceptual relations created within an enlarged time and space universe. The time world was extended beyond the range of remembered things and the space world beyond the range of known places. Writing enormously enhanced a capacity for abstract thinking which had been evident in the growth of language in the oral tradition. Names in themselves were abstractions. Man’s activities and powers were roughly extended in proportion to the increased use and perfection of written records. The old magic was transformed into a new and more potent record of the written word. Priests and scribes interpreted a slowly changing tradition and provided a justification for established authority. An extended social structure strengthened the position of an individual leader with military power who gave orders to agents who received and executed them. The sword and pen worked together. Power was increased by concentration in a few hands, specialization of function was enforced, and scribes with leisure to keep and study records contributed to the advancement of knowledge and thought. The written record signed, sealed, and swiftly transmitted was essential to military power and the extension of government. Small communities were written into large states and states were consolidated into empire. The monarchies of Egypt and Persia, the Roman empire, and the city-states were essentially products of writing. 
Extension of activities in more densely populated regions created the need for written records which in turn supported further extension of activities. Instability of political structures and conflict followed concentration and extension of power. A common ideal image of words spoken beyond the range of personal experience was imposed on dispersed communities and accepted by them. It has been claimed that an extended social structure was not only held together by increasing numbers of written records but also equipped with an increased capacity to change ways of living. Following the invention of writing, the special form of heightened language, characteristic of the oral tradition and a collective society, gave way to private writing. Records and messages displaced the collective memory. Poetry was written and detached from the collective festival. Writing made the mythical and historical past, the familiar and the alien creation available for appraisal. The idea of things became differentiated from things and the dualism demanded thought and reconciliation. Life was contrasted with the eternal universe and attempts were made to reconcile the individual with the universal spirit. The generalizations which we have just noted must be modified in relation to particular empires. Graham Wallas has reminded us that writing as compared with speaking involves an impression at the second remove and reading an impression at the third remove. The voice of a second-rate person is more impressive than the published opinion of superior ability.
And the Pentagon has commissioned military contractors to develop a highly classified replica of the Internet of the future. The goal is to simulate what it would take for adversaries to shut down the country’s power stations, telecommunications and aviation systems, or freeze the financial markets — in an effort to build better defenses against such attacks, as well as a new generation of online weapons.
Just as the invention of the atomic bomb changed warfare and deterrence 64 years ago, a new international race has begun to develop cyberweapons and systems to protect against them.
Social media shifts authority and influence from traditional mainstream voices (e.g., institutional chiefs, professionals, pundits, critics, fashion editors, and other tastemakers) to respected online voices, and eventually to people conversing and sharing their opinions with one another. The availability, transparency, and accessibility of knowledge gained online has broken, in Harold Innis’s apt phrase, “the monopolies of knowledge” enjoyed and controlled by companies, institutions, governments, and elites. This transfer is not new or unique to social media; it occurs whenever new communications technologies take hold. Innis himself researched these changes in his book Empire and Communications, which began by exploring the impacts of moving from stone tablets to adopting papyrus in ancient Egypt and on through millennia to the printing press; each new change in media put one empire in decline and gave rise to a successor (Innis 1972). Today, that successor is the “empire of the customer,” whose knowledge, values, and tastes increasingly influence one another, and influence marketing and advertising every day.
For Drucker, the newest new world was marked, above all, by one dominant factor: “the shift to a knowledge society.”
Indeed, Drucker had been anticipating this monumental leap – to an age when people would generate value with their minds more than with their muscle – since at least 1959, when in Landmarks of Tomorrow he first described the rise of “knowledge work.” Three decades later, Drucker had become convinced that knowledge was a more crucial economic resource than land, labor, or financial assets, leading to what he called a “post-capitalist society.” And shortly thereafter (and not long before he died in 2005), Drucker declared that increasing the productivity of knowledge workers was “the most important contribution management needs to make in the 21st century.”
Today, The Telegraph exclusively reveals the outcome of the world’s biggest-ever investigation into Murphy’s Law, which states that, if things can go wrong, they will go wrong. In a mass experiment carried out in schools across the country, schoolchildren put toast on to plates, and watched what happened when the slices slid off. And they proved beyond reasonable doubt that Murphy’s Law is at work at the breakfast table.
Of almost 10,000 trials, toast landed butter-side down 62 per cent of the time – far more often than the 50 per cent predicted by sceptical scientists. Based on so broad a study, the probability of achieving so big a difference by chance alone is vanishingly small. So, for optimists everywhere, there is no escaping the bleak reality of Murphy’s Law.
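The claim that so large a deviation could hardly arise by chance can be checked with a one-sided binomial tail. A minimal sketch in Python, assuming round figures of n = 10,000 drops and k = 6,200 butter-side-down landings (the article gives only approximate counts, so these numbers are illustrative):

```python
import math

# One-sided binomial tail: probability of seeing at least k butter-side-down
# landings in n independent drops if each side were truly equally likely.
n, k = 10_000, 6_200

# Exact computation via binomial coefficients (Python handles the big integers).
p_chance = sum(math.comb(n, i) for i in range(k, n + 1)) / 2**n

print(f"P(X >= {k} | p = 0.5) = {p_chance:.3e}")  # vanishingly small
```

With these figures the tail probability comes out well below 1e-100, which is the sense in which a 62 per cent result over ten thousand trials is incompatible with the sceptics' 50 per cent prediction.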
For me, as a confirmed pessimist and scientific consultant to the project, the outcome of the mass experiment marks the end of a seven-year quest to confirm my darkest suspicions: that things go wrong because the universe is made that way.
We examine how susceptible jobs are to computerisation. To assess this, we begin by implementing a novel methodology to estimate the probability of computerisation for 702 detailed occupations, using a Gaussian process classifier. Based on these estimates, we examine expected impacts of future computerisation on US labour market outcomes, with the primary objective of analysing the number of jobs at risk and the relationship between an occupation’s probability of computerisation, wages and educational attainment. According to our estimates, about 47 percent of total US employment is at risk. We further provide evidence that wages and educational attainment exhibit a strong negative relationship with an occupation’s probability of computerisation.
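The headline figure can be illustrated with the paper's decision rule: an occupation counts as "high risk" when its estimated probability of computerisation exceeds 0.7, and the at-risk share is the employment-weighted share of such occupations. A toy sketch in Python with invented occupations, probabilities, and employment counts (not the paper's actual O*NET/BLS data):

```python
# Each entry: (occupation, estimated P(computerisation), employment count).
# All figures below are invented for illustration only.
HIGH_RISK = 0.7  # the paper's threshold for the "high risk" category

occupations = [
    ("telemarketer",       0.99,   200_000),
    ("truck driver",       0.79, 1_500_000),
    ("accountant",         0.94, 1_200_000),
    ("registered nurse",   0.01, 2_700_000),
    ("software developer", 0.04, 1_100_000),
]

total = sum(emp for _, _, emp in occupations)
at_risk = sum(emp for _, p, emp in occupations if p > HIGH_RISK)

print(f"share of employment at high risk: {at_risk / total:.0%}")
```

With real estimates over 702 occupations, the same employment-weighted sum is what yields the roughly 47 percent figure reported in the abstract.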
As a lifelong consumer used to spending large amounts of money to obtain food, stuff, and entertainment, I find it hard to imagine how it’s possible to spend practically nothing on furniture, a few dollars on clothing, very little on food, almost nothing on transport, and generally less on rent/mortgage.
However, it’s possible to live on a third or even a quarter of the median income, putting one solidly below the government defined poverty line, without living in austerity and eating grits. There is no reason to pay “retail.” You can enjoy the fun of beating the system that exists to take your money and live a middle-class lifestyle on a quarter of the usual numbers. But why aim low? Why not live an upper-class lifestyle and think of yourself as a poor aristocrat? It requires a somewhat different approach, though, and it requires some skill. It also requires a reprogramming of “the way we’ve always done it,” or, rather, the way we usually do it.
You cannot write knowledge down on paper, nor can you relay knowledge to other people in a talk or a lecture. Knowledge strictly exists inside people’s heads. Knowledge is therefore a private matter, and all a teacher can do is facilitate the student’s process of forming this knowledge in his head. The idea that teachers pour knowledge into students’ heads as if it were some kind of product is founded on a complete misunderstanding of how the human mind works. Brains are not computers.
The social aggregate consequence of this is that the amount of knowledge is strictly proportional to the amount of time people spend thinking about information.
This is a moonshot challenge, akin to the Human Genome Project in scope. The scientific value of recording the activity of so many neurons and mapping their connections alone is enormous, but that is only the first half of the project. As we figure out the fundamental principles governing how the brain learns, it’s not hard to imagine that we’ll eventually be able to design computer systems that can match, or even outperform, humans.
If I were a character in a computer game, I would also discover eventually that the rules seemed completely rigid and mathematical. That just reflects the computer code in which it was written.
Human Brain Projects
- EU Human Brain Project
- Blue Brain Project
- Israeli scientists help digitize brain
- US BRAIN Initiative
- NIH Human Connectome Project
- A call for ‘brain observatories’
- New ‘moonshot’ effort to understand the brain brings artificial intelligence closer to reality
- Organization for human brain mapping
- 3-D Map of the brain
- The glass brain
- A roundtable with the Kavli Institute for Fundamental Neuroscience
- Boosting synaptic plasticity to facilitate learning (DARPA)
- Mapping the brain to build better machines
- Kavli Institute for Brain and Mind (UCSD)
- The Allen Brain Atlas
- The Harvard whole brain atlas
- Japan Brain/MINDS Project
- China brain project to be launched
- Russia’s 2045 initiative
- Australian Brain Initiative
- New research replicates the folding of a fetal human brain
- Identifying the brain’s essential elements
- Protein imaging reveals detailed brain architecture
- An accessible approach to making a mini-brain
- To digitize a brain, first slice 2000 times with a very sharp blade
- CraMs: Craniometric Analysis application using 3D skull models
- Minimally invasive ‘stentrode’ shows potential as neural interface for brain
- Reliable Neural Interface Technology (RE-NET)
- Comparative mammalian brain collections
- Jawless fish brains more similar to ours than previously thought
Brain science and brain-like intelligence have become hot topics around the world in recent years. While the U.S. and the EU have launched their own research programs, the Chinese government also places great importance on brain study, and China’s brain project will soon be launched, according to a report in People’s Daily on Monday.
Traditional intelligence technology can hardly meet the demands of processing the vast amounts of complex information in the modern information society. With the development of brain cognition and neuroscience, researchers have realized that intelligence technology can draw inspiration from brain science and neuroscience in order to develop new theories and methods that raise the level of machine intelligence.
Experts pointed out that the deep integration of brain science and intelligence technology will greatly promote the breakthroughs and development of brain-like intelligence research, lead the future direction of artificial intelligence development, reshape a country’s industry, military, and service structure, and become an important manifestation of a nation’s core competitiveness.
Brain science has attracted global attention in recent years as an area of science with the unique potential to revolutionize human psychology, cure mental illness, and transform society with next-generation technologies. The US, Europe, and other countries have committed major support for large-scale brain research programs.
Japan’s effort, called Brain Mapping by Innovative Neurotechnologies for Disease Studies (Brain/MINDS), is supported by the Ministry of Education, Science, and Technology (MEXT). The RIKEN Brain Science Institute (BSI) in Wako, Saitama Prefecture, will serve as the project’s core administrative and research facility.
Scientifically, the Brain/MINDS project will address a fundamental question in neuroscience: how does the human mind work? The project’s goal is to accelerate the development of technologies for mapping the brain’s circuitry in animal models and connecting the results to the diagnosis and treatment of human mental illness.
Clearly, business buzzwords and phrases like “leverage,” “best practices,” and “giving 110 percent” suck total ass. These expressions are — from what I understand — the only thing you learn in MBA programs besides how to shake your classmate’s CEO dad’s hand firmly and how to identify various yachts by the makeup of their kitchen staff alone. But even as corporate jargon gets a terrible, horrible, no good, very bad rap, some of it can prove useful.
David Hanson: Do you want to destroy humans? Please say No.
Sophia: OK. I will destroy humans.
Without most of us noticing, our everyday activities — everything from getting cash at an ATM to watching this program — depend on satellites in space. And for the U.S. military, it’s not just everyday activities. The way it fights depends on space. Satellites are used to communicate with troops, gather intelligence, fly drones and target weapons. But top military and intelligence leaders are now worried those satellites are vulnerable to attack. They say China, in particular, has been actively testing anti-satellite weapons that could, in effect, knock out America’s eyes and ears.
No one wants a war in space, but it’s the job of a branch of the Air Force called Space Command to prepare for one. If you’ve never heard of Space Command, it’s because most of what it does happens hundreds, even thousands, of miles above the Earth or deep inside highly secure command centers. You may be as surprised as we were to find out how the high-stakes game for control of space is played.
The research being done at the Starfire Optical Range in Albuquerque, New Mexico, was kept secret for many years and for a good reason which only becomes apparent at night.
U.S. space systems are some of the most vulnerable military assets the Defense Department has. A small chunk of metal can easily disable a satellite, as can lasers and other electronic weapons. On the ground, a cyber-attack could cut off the military’s ability to communicate with and control assets in space.
And a spy satellite becomes useless if someone spray paints over the camera — an actual offensive tactic that’s been discussed among space experts.
In 1985, the U.S. was first able to destroy a satellite in orbit by launching a missile from a high-flying F-15 Eagle.
China and Russia have followed suit. In 2007, the Chinese successfully targeted and destroyed one of their own satellites in orbit, and in 2013 were suspected of testing a ground-based missile launch system to destroy objects in orbit.
And just a few months ago, on Nov. 18, 2015, Russia successfully tested its own satellite-destroying missile.
| Explicit knowledge | Tacit knowledge |
| --- | --- |
| Academic knowledge or “know-what” that is described in formal language, print or electronic media; often based on established work processes; uses a people-to-documents approach | Practical, action-oriented knowledge or “know-how” based on practice; acquired by personal experience; seldom expressed openly; often resembles intuition |
| Work process – organized tasks, routine, orchestrated; assumes a predictable environment; linear; reuses codified knowledge; creates knowledge objects | Work practice – spontaneous, improvised, web-like; responds to a changing, unpredictable environment; channels individual expertise; creates knowledge |
| Learn – on the job, trial-and-error; self-directed in areas of greatest expertise; meets work goals and objectives set by the organization | Learn – supervisor or team leader facilitates and reinforces openness and trust to increase sharing of knowledge and business judgment |
| Teach – trainer-designed using a syllabus; uses formats selected by the organization; based on the goals and needs of the organization; may be outsourced | Teach – one-on-one; mentoring, internships, coaching, on-the-job training, apprenticeships; competency-based; brainstorming; people to people |
| Type of thinking – logical, based on facts; uses proven methods; primarily convergent thinking | Type of thinking – creative, flexible, uncharted; leads to divergent thinking; develops insights |
| Share knowledge – extract knowledge from a person; code, store and reuse as needed for customers; e-mail, electronic discussions, forums | Share knowledge – altruistic sharing; networking, face-to-face contact, videoconferencing, chatting, storytelling; personalizes knowledge |
| Motivation – often based on the need to perform to meet specific goals | Motivation – inspire through leadership, vision and frequent personal contact with employees |
| Reward – tied to business goals; competitive within the workplace; compete for scarce rewards; may not reward information sharing | Reward – incorporate intrinsic or non-monetary motivators and rewards for sharing information directly; recognize creativity and innovation |
| Relationships – may be top-down, from supervisor to subordinate or team leader to team members | Relationships – open, friendly, unstructured; based on open, spontaneous sharing of knowledge |
| Technology – related to the job; based on availability and cost; invest heavily in IT to develop a professional library with a hierarchy of databases using existing knowledge | Technology – a tool to select personalized information, facilitate conversations and exchange tacit knowledge; invest moderately in the IT framework; enable people to find one another |
| Evaluation – based on tangible work accomplishments, not necessarily on creativity and knowledge sharing | Evaluation – based on demonstrated performance; ongoing, spontaneous evaluation |
1960s through late 1990s – While representing long-time client Philip Morris, Ruder Finn was instrumental in crafting the public relations campaign that disputed the evidence that tobacco smoking is hazardous to health.
1997 – Ruder Finn ran the Global Climate Coalition, a group of mainly United States businesses opposing action to reduce greenhouse gas emissions.
1998 – Ruder Finn was caught in a conflict of interest when discoveries surfaced about the post-World War II financial dealings of Swiss authorities, dealings that involved some of the firm’s Jewish clients.
2005 – Pro bono work done for the UN prompted speculation when Kofi Annan’s nephew, Kobina, worked as an intern at the firm.
2012 – Ruder Finn accepted a contract worth £150,000 per month from the government of the Maldives, which was being condemned by many nations and organizations (including the Commonwealth) for organizing a political coup d’état that led to the fall of the first democratically elected President of the Maldives.
Zagar Communications is the first full-service public relations firm dedicated to helping organizations succeed in the Myanmar market.
It was founded by a team of professionals with a cumulative 60+ years’ experience in public relations, emerging markets and finance in Southeast Asia, gained with some of the best-known names in international communications.
The United States Justice Department announced this week that it was able to unlock the iPhone used by the gunman in the San Bernardino shooting in December, and that it no longer needed Apple’s assistance. F.B.I. investigators have not said how they were able to access the smartphone, but a law enforcement official said that a company outside the government had helped them hack into the operating system.
Should hackers help the government?
- humans have the largest brain of all animals. ==> bollocks.
- humans employ intelligent use of tools. ==> bullshit.
- humans are social animals. ==> human society is absolutely embarrassing.
- humans show emotions. ==> octopuses show emotions.
- humans are the only creatures who communicate using language. ==> dolphins have quite a complicated language; ants use thousands of chemicals to convey information.
- humans are the most intelligent of all creatures. ==> there is no evidence to show that humans have cognitive skills that are unique to us.
- humans can cut an onion, but an onion can hardly cut us ==> a man can kill a lion more easily than a lion can kill a man.
As we developed Tay, we planned and implemented a lot of filtering and conducted extensive user studies with diverse user groups. We stress-tested Tay under a variety of conditions, specifically to make interacting with Tay a positive experience. Once we got comfortable with how Tay was interacting with users, we wanted to invite a broader group of people to engage with her. It’s through increased interaction where we expected to learn more and for the AI to get better and better.
The logical place for us to engage with a massive group of users was Twitter. Unfortunately, in the first 24 hours of coming online, a coordinated attack by a subset of people exploited a vulnerability in Tay. Although we had prepared for many types of abuses of the system, we had made a critical oversight for this specific attack. As a result, Tay tweeted wildly inappropriate and reprehensible words and images.
- humans have the largest brain of all animals – bollocks. the sperm whale has a brain weighing 20 pounds. a human brain weighs about 3.
- humans employ intelligent use of tools – it is bullshit. chimpanzees lick the end of a stick and put it in an anthill to catch ants. oh, hell, we don’t have to go to such “advanced” species. just look at the spider’s web.
- humans are social animals – human society is absolutely embarrassing compared to many insect societies. look at ants. or bees. who is more social? humans or bees?
- humans are the only creatures who communicate using language – it is well known that dolphins have quite a complicated language. actually, ants use thousands of chemicals to convey information.
- humans are the most intelligent of all creatures – there is no evidence to show that humans have cognitive skills that are unique to us. dolphins, for example, are known to be extremely intelligent. there is no acceptable universal test for human intelligence yet.
Arguments against the significance of the Toba super-eruption for abrupt, catastrophic climate change and population reduction have provided the opportunity for a detailed reexamination and restatement of the main points of the hypothesis of Toba’s relationship to the human population bottleneck, and to cultural developments. In summary:
- The eruption was significantly larger than previously estimated, and caused a millennium of the coldest temperatures of the Upper Pleistocene.
- The population bottleneck was real rather than “putative”, and it occurred during the first half of the last glacial period. Mass extinctions were not a feature of this event.
- Capacities for modern human behavior were undoubtedly present during the last interglacial, but the stable environments of this period did not foster widespread adoption of the strategic cooperative skills necessary for survival in the last glacial era.
Are terrorists more of a threat than slippery bathtubs? In 2013, 464 people drowned in bathtubs in America, sometimes after falls, while 17 people were killed here by terrorists in 2014.
The basic problem is this: The human brain evolved so that we systematically misjudge risks and how to respond to them.
Unfortunately, our brains are not well adapted to most of the biggest threats we actually face in the 21st century. Warn us that climate change is destroying our planet, and only a small part of our prefrontal cortex will glimmer; then we’ll go back to worrying about terrorists.
Our brains are perfectly evolved for the Pleistocene, but are not as well suited for the risks we face today.
Do you believe in God?
Late nineteenth-century German philosophers used the word Einfühlung, later translated as empathy, when discussing aesthetics. One of the earliest appearances of the word was in 1873, when philosopher Robert Vischer used Einfühlung to discuss the pleasure we experience when we contemplate a work of art. The word represented an attempt to describe our ability to get ‘inside’ a work of beauty by, for example, projecting ourselves and our feelings ‘into’ a painting, a sculpture, a piece of music, even the beauty of nature itself.
‘For the romantic,’ comments Stueber, ‘nature is properly understood only if it is seen as an outward symbol of some inner spiritual reality.’ As the work of art or the beauty of nature resonates with us, the feelings generated are projected into, and then felt to be a quality of, that work of art, that glorious nature.
If we can ‘feel our way into’ a work of art in an act of empathy, our understanding increases and our appreciation deepens. With particularly powerful works of art, we feel ourselves reacting both viscerally and emotionally. As our bodies resonate with the flow of the paint, the pain of a face, the strength of a buttress, the flight of a spire, our feelings vibrate in tune with the emotions of the work we are contemplating. We have an aesthetic experience. We are moved in our contemplation of a sensuous object.
Empathy not only entails knowing what a person is feeling and feeling what a person is feeling, but also communicating, perhaps with compassion, the recognition and understanding of the other’s emotional experience.
The experience of your own body as your own subjectivity is then applied to the experience of another’s body, which, through apperception, is constituted as another subjectivity. You can thus recognise the Other’s intentions, emotions, etc. This experience of empathy is important in the phenomenological account of intersubjectivity. In phenomenology, intersubjectivity constitutes objectivity (i.e., what you experience as objective is experienced as being intersubjectively available – available to all other subjects. This does not imply that objectivity is reduced to subjectivity nor does it imply a relativist position, cf. for instance intersubjective verifiability).
In the experience of intersubjectivity, one also experiences oneself as being a subject among other subjects, and one experiences oneself as existing objectively for these Others; one experiences oneself as the noema of Others’ noeses, or as a subject in another’s empathic experience. As such, one experiences oneself as objectively existing subjectivity. Intersubjectivity is also a part in the constitution of one’s lifeworld, especially as “homeworld.”
After decades as the cultivated interest of scholars in philosophy and in clinical and developmental psychology, empathy research is suddenly everywhere! Seemingly overnight it has blossomed into a vibrant, multidisciplinary field of study and has crossed the boundaries of clinical and developmental psychology to plant its roots in the soil of personality and social psychology, mainstream cognitive psychology, and cognitive-affective neuroscience.
If we did not make judgments of what other people know, how they feel, and what they are likely to do in specific situations, communication would be impossible. Writers have to gauge their expositions to the level of relevant background knowledge expected of their intended audiences. Speakers in everyday conversation must make assumptions about what the other parties to the conversation do and do not know in order to ensure that what they say will be understood.
You never really understand a person until you consider things from his point of view… Until you climb inside of his skin and walk around in it.
I believe that there’s a difference between knowing something and understanding it. You know how you’ll try to communicate something very important to you to another person and sometimes they’ll wave you off with an impatient, “I know, I know”? That’s knowing: I got the gist, filed it away, I don’t need to think about it again. Knowing is comprehension; understanding is deeper because it comes from empathy or identification.
Empathy is the capacity to think and feel oneself into the inner life of another person.
The empathic understanding of the experience of other human beings is as basic an endowment of man as his vision, hearing, touch, taste and smell.
The art of imaginatively stepping into another person’s shoes and seeing the world from their perspective is, it would seem, a most valuable and valued twenty-first century asset.
Not so, says Yale psychologist Paul Bloom, leading the counter-charge against empathy’s popularity surge. Bloom creates a false – and dangerous – dichotomy between empathy and reason, and misses the long lesson from history: that time and again, empathy has played a crucial role in creating a democratic culture that respects human rights. So where have the critics gone wrong? …
If affective empathy is our mirror for reflecting others’ emotions, cognitive empathy is, by contrast, a pair of shoes that invites us to imagine the world from their viewpoint.
Cast empathy aside to lean on reason alone and we would become emotionally tone deaf and politically indifferent. That is not who we want to be and – more importantly – it is not who we are.