#its how the dataset is gathered for me. and who benefits from it.
bigsnorp · 8 months
imo the recent wave of "ai art is valid actually" discourse often misses the point of the reaction against ai art. And I think the reaction itself is often aimed at the wrong thing too.
The end product is not the problem. Like it or not, AI generated art is now a valid subset of art. It makes something that wasn't there before, usually has some purpose or intent in the process, and it makes people feel something. Hating it is still something.
The problem, as always, is capitalism. Just like that old man said.
Creating an AI model always involves training it on a dataset. The bigger the better, every single time. How do you do that? You write a program to scrape everything from whatever you point it at, and run it through various levels of processing until your AI reliably spits out something you want to see.
This is hard to do, so there is a lot of monetary value in creating one, hosting it, and offering it to others. At no point do you have to create anything of value, you just have to take what other people made and extract surplus value from it. And you do this while interacting as little as possible with the art you've used for this massive dataset, or god forbid with the artists that made it.
In return these artists get nothing, usually. Not so much as a note of acknowledgement in the readme.
Human artists can and do steal from, plagiarise, and piss off other artists all the time. But they cannot do it at the scale of AI art, nor can they completely divorce the original art from its context. Even when this is done for financial gain, it's on an entirely different level, with entirely different stakes. I'm not talking about copyright law or intellectual property, I'm talking about large scale labour devaluation and deskilling for the sake of corporate capitalism which is an inherent part of most of the AI generative art tools that are used by people today.
There's a lot of cool possibilities and interesting thought-provoking questions that have become immediately obvious with AI art and it's not in any rush to go away. I think spending hours debating the artistic merit of the end product of AI generated art is the wrong thing to focus on right now. Humans also make art that is uninspired or derivative or mass-produced and it can still be artistically valuable. Look at Duchamp's fountain, literally mass produced and likely plagiarised from another artist, and one of the most valuable pieces of modern art history we have.
It's about the scale. The decontextualisation. The capital gain for a generally uninterested company seeking to extract surplus value from the unconsenting labour. The picture that gets generated at the end is kind of the least interesting part of the discussion.
lumateranlibrarian · 5 years
Don’t you just hate it when people screw you over, and when you confront them about it, they profusely apologize but don’t bother to explain themselves?
Like, I’m willing to give people the benefit of the doubt. But if you know you’re in the wrong, or don’t have a way to justify the bullshit you’re pulling... then why the hell are you trying to bullshit in the first place?
I don’t really care about apologies. I’ll easily admit I’m not the best at giving them. I care more about results, though, and when it’s both of our asses on the line, if you can’t pull your weight as a grown-ass adult, you shouldn’t be so surprised when I say I’m going on without you.
For context: we’re both graduate students with almost two years’ worth of classes and research under our belts. I should not have to be responsible for my partner’s motivation and contributions when she hasn’t done shit for three weeks. It’s only now, when I tell her I’m done waiting for her to contribute and I’m taking charge of this project with a week left before we have to present it, that she decides to post everything on a shared document and powerpoint that I made for us to work off of weeks ago.
And here’s the thing. I gave her the absolute easiest tasks. Mostly because I didn’t know her and had never worked with her before. I asked her to write the methodology portion of our paper. Literally one page of text to talk about how the data was gathered, where it came from, and what the basic distribution of the data was (aka missingness, simple differences between groups, etc).
She texted me a picture of two (2) graphs.
I was the one who talked with the professor about getting the dataset. I was the one who asked the professor for help formatting properly (because SAS is a hell program, sorry I’m an R girl through and through). I’m the one who made the shared documents for us to work off of, came up with not one but two possible research questions, looked at our variables and decided which ones were reliable and interesting enough to use.
She only started putting things on our powerpoint tonight, after I told her I was done waiting around for her to contribute. When I asked why she hadn’t done anything yet, she said she was “busy”. Busy? Really? I was busy too. I was extremely busy, having missed class from having shingles, having midterms and finals and papers and presentations and meetings, all to organize and prepare. But somehow I managed to open up the freaking SAS document without needing to be reminded that it existed.
And here’s the thing. I have it from THREE DIFFERENT SOURCES that this isn’t the first time she’s pulled this shit.
One is a fellow student whose partner for the same project had been partnered with her last semester on a different project. He says This Girl approached his partner about partnering for the current project, and his partner advised against accepting the partnership because she’d flaked on a project they’d worked on “together” in a previous class.
One is my own faculty mentor, my research advisor and lab PI. Last week, I was in his office trying not to burst into tears because I was losing my grip on all the papers, projects, and finals I had to complete within a span of about five weeks (a homework assignment, a paper, a final presentation, and a midterm each for two different classes, a manuscript for my first first-author publication, and preparing for a committee meeting that will directly feed into my dissertation work, while attending regular classes and seminars). And when I mentioned my frustration with This Girl’s ability to contribute or even conceptualize the very simple problems I was talking about, he admitted that she had taken his basic epidemiology class and that she was one of his worst students. I didn’t ask for specifics, but his words were, “yeah, I was worried about her when she took my class.”
AND THEN. IF MY GODDAMN RESEARCH MENTOR SAYING SHE WAS A CRAPPY PARTNER WASN’T ENOUGH.
One of my best friends had a bridal shower yesterday (super fun, met a lot of lovely folks, got to make toilet paper wedding dresses on people). I mentioned I was having trouble with This Girl as a project partner, and when I named This Girl to my friend, she revealed that This Girl used to work in the same lab as her, but got fired because she never showed up to work and never completed anything.
So I’m just like.
Okay. 
And I’m supposed to rely on this person?
Of course not. It’s laughable.
It doesn’t make any difference that she’s a masters student and I’m on the PhD track. We’ve both had at least an undergraduate education, which is pretty strenuous on its own, so we both should have some understanding of what it means to pull one’s own weight on a substantial class project. But it’s not just that, it’s being an adult who doesn’t need their hand held and their ass wiped to contribute on a project. I’m sorry, This Girl, but I have standards and I have a grade I need to make to pass this class.
You’re worried I’m going to finish this project without you? At this point, it’s really not my problem.
frankkjonestx · 4 years
This year’s SN 10 scientists aim to solve some of science’s biggest challenges
In the midst of a pandemic that has brought so much worry and loss, it’s natural to want to help — to do some small part to solve a problem, to counter pain, or to, importantly, remind others that there is beauty and wonder in the world. Scientists have long been doing just that. Many are chasing answers to the myriad challenges that people face every day, and revealing the rewards in the pursuit of knowledge itself. It’s in that spirit that we present this year’s SN 10: Scientists to Watch.
For the sixth consecutive year, Science News is featuring 10 early- and mid-career scientists who are pushing the boundaries of scientific inquiry. Some of the researchers are asking questions with huge societal importance: How do we prevent teen suicide? What are the ingredients in wildfire smoke that are damaging to health? Is there a better way to monitor earthquakes to save lives? What about finding new ways to diagnose and treat diseases?
Others are trying to grasp how weird and wonderful the natural world is — from exploring how many supermassive black holes are out there in space to understanding the minuscule genetic details that drive evolution. For instance, SaraH Zanders, one of this year’s SN 10, is unveiling the drama that unfolds when life divvies up its genetic material.
A couple of the scientists on this year’s list have also taken steps to support people from groups that are underrepresented in the sciences. These researchers see how science benefits when people from diverse backgrounds contribute to the pursuit of answers.
All of this year’s honorees are age 40 and under, and all were nominated by Nobel laureates, recently elected members of the U.S. National Academy of Sciences or previous SN 10 scientists. The world feels very different than it did at the start of 2020, when we first put out our call for SN 10 nominations, but the passion these scientists have for their work endures. The curiosity, creativity and drive of this crew offers hope that we can overcome some of our biggest challenges.
Though it often takes time, out of crisis comes action. Also out of crisis comes a renewed appreciation for small pleasures that give life meaning. These researchers find joy in the search for scientific answers. Here’s how Zanders describes what motivates her work: “It’s just I like to solve puzzles.” — Elizabeth Quill
The 2020 SN 10: Scientists to Watch
Tonima Tasnim Ananna
Alessandra Corsi
Emily Fischer
Prashant Jain
Anna Mueller
Phiala Shanahan
Mikhail Shapiro
Bo Wang
SaraH Zanders
Zhongwen Zhan
Black hole hunter seeks a cosmic census
Credit: Eli Burakian/Dartmouth College
Tonima Tasnim Ananna, 29
Astrophysicist
Affiliation: Dartmouth College
Hometown: Dhaka, Bangladesh
Favorite black hole: Cygnus X-1
Standout research
Tonima Tasnim Ananna is bringing the heaviest black holes out of hiding. She has drawn the most complete picture yet of black holes across the universe — where they are, how they grow and how they affect their environments. And she did it with the help of artificial intelligence.
As far as astronomers can tell, nearly every galaxy stows a black hole at its center, weighing millions or billions of times the mass of the sun. Though these supermassive black holes can heat surrounding material until it glows brighter than all the galaxy’s stars combined, the light can be concealed by gas and dust also drawn in by the black hole’s pull. High-energy X-rays cut through that dusty veil. So for her Ph.D., completed in 2019, Ananna gathered surveys from four X-ray telescopes, more datasets than any previous study had used. Her goal was to create a model of how black holes grow and change across cosmic history. “It was supposed to be a short paper,” Ananna says. But models that explained one or a few of the datasets didn’t work for the full sample. “It stumped us for some time.”
To break the gridlock, she developed a neural network, a type of artificial intelligence, to find a description of the black hole population that explained what all the observatories saw. “She just went off and taught herself machine learning,” says astrophysicist Meg Urry of Yale University, Ananna’s Ph.D. adviser. “She doesn’t say, ‘Oh, I can’t do this.’ She just figures out a way to learn it and do it.” One early result of the model suggests that there are many more active black holes out there than previously realized.
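The joint-fit idea at the heart of that work can be sketched in a few lines. The Python toy below is purely illustrative — invented numbers, a deliberately crude two-parameter population model, and a plain likelihood fit rather than the neural network Ananna actually built — but it shows the core constraint: one set of population parameters has to reproduce every survey at once.

```python
# Toy illustration (not Ananna's actual code or data): fit one population
# model to several mock "surveys" at once. The model is a deliberately simple
# two-parameter description -- a total number density and an obscured
# fraction -- and the surveys are fake counts drawn with different
# sensitivities to obscured sources.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

true_density, true_obscured_frac = 100.0, 0.7          # invented "truth"
# Each mock survey sees unobscured sources fully but obscured ones only
# partially (higher-energy X-ray surveys see more of them).
completeness_to_obscured = np.array([0.2, 0.5, 0.8, 0.95])

def expected_counts(density, obscured_frac, completeness):
    """Sources a survey should detect, given the population model."""
    unobscured = density * (1.0 - obscured_frac)
    obscured = density * obscured_frac * completeness
    return unobscured + obscured

observed = rng.poisson(expected_counts(true_density, true_obscured_frac,
                                       completeness_to_obscured))

def neg_log_likelihood(params):
    density, obscured_frac = params
    if density <= 0 or not (0 < obscured_frac < 1):
        return np.inf
    mu = expected_counts(density, obscured_frac, completeness_to_obscured)
    # Poisson log-likelihood summed over all surveys: the key point is that
    # a single parameter set must explain every dataset simultaneously.
    return np.sum(mu - observed * np.log(mu))

fit = minimize(neg_log_likelihood, x0=[50.0, 0.5], method="Nelder-Mead")
print("fitted density and obscured fraction:", fit.x)
```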
Big goal
Galaxies live and die by their black holes. “When a black hole puts out energy into the galaxy, it can cause stars to form,” Ananna says. “Or it could blow gas away,” shutting down star formation and stunting the galaxy’s growth (SN: 3/31/20). So understanding black holes is key to understanding how cosmic structures — everything from galaxy clusters down to planets and perhaps even life — came to be. Ananna’s model is built on data describing black holes at different cosmic distances. Because looking far in space is like looking back in time, the model shows how black holes grow and change over time. It could also help figure out how efficiently black holes eat. Early hints suggest black holes could be gobbling down gas as fast as theoretically possible, which may help explain how some got so big so fast (SN: 3/16/18).
Inspiration
When Ananna was a 5-year-old in Dhaka, Bangladesh, her mother told her about the Pathfinder spacecraft landing on Mars. Her mother was a homemaker, she says, but was curious about science and encouraged Ananna’s curiosity, too. “That’s when I realized there were other worlds,” she says. “That’s when I wanted to study astronomy.” There were not a lot of opportunities to study space in Bangladesh, so she came to the United States for undergrad, attending Bryn Mawr College in Pennsylvania. She chose an all-women’s school not known for a lot of drinking to reassure her parents that she was not “going abroad to party.” Although Ananna intended to keep her head down and study, she was surprised by the social opportunities she found. “The women at Bryn Mawr were fiercely feminist, articulate, opinionated and independent,” she says. “It really helped me grow a lot.” Internships at NASA and at CERN, the European particle physics laboratory near Geneva, and a year at the University of Cambridge boosted her confidence. (She did end up going to some parties — “no alcohol for me, though.”)
Now, Ananna is giving back. She cofounded Wi-STEM (pronounced “wisdom”), a mentorship network for girls and young women who are interested in science. She and four other Bangladeshi scientists who studied in the United States mentor a group of 20 female high school and college students in Bangladesh, helping them find paths to pursue science. — Lisa Grossman
Back to SN 10 list
Pioneer pairs light with gravitational waves
Credit: Texas Tech Univ.
Alessandra Corsi, 40
Astrophysicist
Affiliation: Texas Tech University
Hometown: Rome, Italy
Favorite telescope: Very Large Array, New Mexico
Standout research
On September 3, 2017, Alessandra Corsi finally saw what she had been waiting for since mid-August: a small dot in her telescope images that was the radio afterglow of a neutron star collision. That stellar clash, discovered by the Advanced Laser Interferometer Gravitational-Wave Observatory team, or LIGO, which included Corsi, was the first direct sighting of a neutron star collision (SN: 10/16/17). The event, dubbed GW170817, was also the first of any kind seen in both gravitational waves and light waves.
Telescopes around the world spotted all kinds of light from the crash site, but one particular kind, the radio waves, took their sweet time showing up. Corsi had been waiting since August 17, when the gravitational waves were spotted. “Longest two weeks of my life,” Corsi says. The radio waves were key to understanding a superfast particle jet launched by the colliding stars.
Early on, the jet appeared to have been smothered by a plume of debris from the collision (SN: 12/20/17). But follow-up radio observations made by Corsi’s team and others confirmed that the jet had punched through the wreckage (SN: 2/22/19). This jet was the first of its kind to be seen from the side, allowing Corsi and colleagues to probe its structure. The jet almost certainly would have gone unnoticed if the gravitational waves hadn’t clued astronomers in.
Big goal
Corsi is a pioneer in the new field of multimessenger astronomy, which pairs observations of light waves with spacetime ripples, or gravitational waves. The pairing is like having eyes and ears on the cosmos, Corsi says. “You cannot learn all that you could with only one of the two.” In the case of GW170817, gravitational waves revealed how the neutron stars danced around each other as they spiraled toward collision, and light waves unveiled the type of material left in the aftermath (SN: 10/23/19). Using this multimessenger approach could also give astronomers a more complete picture of other cataclysms, such as smashups between neutron stars and black holes, and the explosive deaths of massive stars. Such spectacular events “reveal some of the most fundamental physics in our universe,” Corsi says.
Most researchers specialize in either gravitational waves or light, but Corsi “is very well-versed in both messengers,” says Wen-fai Fong, an astrophysicist at Northwestern University in Evanston, Ill. “That makes her extremely versatile in terms of the types of multimessenger science she can study.”
What’s next
Corsi has now built a computational tool to scan LIGO data for gravitational waves stirred up by whatever is left behind in a neutron star merger. The tool is based on a paper she published in 2009 — years before LIGO scored its first gravitational wave detection (SN: 2/11/16). The paper describes the gravitational wave pattern that would signal the presence of one possible remnant: a rapidly spinning, elongated neutron star. Alternatively, a neutron star smashup could leave behind a black hole. Knowing which “tells us a lot about how matter behaves at densities way higher than we could ever explore in a lab,” Corsi says.
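The article doesn’t spell out how the tool works internally; a generic way to scan a data stream for a known waveform is matched filtering, sketched below with entirely made-up signals and numbers. It is only an illustration of the pattern-search idea, not Corsi’s pipeline.

```python
# Minimal matched-filter sketch (not Corsi's tool): slide a known waveform
# template across noisy data and look for a peak in the correlation.
# The "template" here is a decaying sinusoid standing in for a long-lived
# signal from a spinning, elongated remnant; everything is invented.
import numpy as np

rng = np.random.default_rng(1)
fs = 1024                                   # samples per second
t = np.arange(0, 1.0, 1.0 / fs)

template = np.exp(-t / 0.3) * np.sin(2 * np.pi * 200 * t)
template /= np.linalg.norm(template)

# Build fake "detector" data: noise plus a weak copy of the template
# buried starting at 2.0 s.
data = rng.normal(0.0, 1.0, 4 * fs)
start = 2 * fs
data[start:start + template.size] += 3.0 * template

# Correlate the template against every possible start time.
stat = np.correlate(data, template, mode="valid")
best = np.argmax(np.abs(stat))
print(f"loudest match at t = {best / fs:.3f} s, statistic = {abs(stat[best]):.1f}")
```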
Inspiration
Corsi taught herself to play the piano in high school, and now enjoys playing both classical music and tunes from favorite childhood movies, like Beauty and the Beast. The audio frequencies of piano notes are similar to the frequencies of spacetime tremors picked up by LIGO. If gravitational wave signals were converted into sound, they would create their own kind of music. “That’s the thing I like to think of when I’m playing,” she says. — Maria Temming
Back to SN 10 list
What’s in smoky air?
Credit: Bill Cotton/Colorado State Univ.
Emily Fischer, 39
Atmospheric chemist
Affiliation: Colorado State University
Hometown: Richmond, R.I.
Favorite outdoor activities: Cross-country skiing and gardening
Motivation
Emily Fischer has always cared about air pollution. “It’s innate.… It’s a calling,” she says. Exposure to air pollution raises your risk for many common ailments, such as cardiovascular disease, asthma, diabetes and obesity. But unlike some other risk factors for these diseases, “you can’t choose not to breathe, right? You have to have clean air for everyone.” In her youth, she organized rallies to clean up the cigarette smoke–filled air of her Rhode Island high school. That interest led Fischer to study atmospheric chemistry and motivates her current work as a self-described air pollution detective. Air pollution may conjure images of thick black plumes billowing from smokestacks, but Fischer says most air pollution is invisible and poorly understood. She combines analytical chemistry with high-flying techniques to understand where air pollution comes from and how it changes as it moves through the air.
Bold idea
Wildfire smoke like that filling the skies in the American West this season is a major, but still mysterious, source of air pollution. Thousands of different solids, liquids and gases swirl together to form wildfire smoke, and its chemical composition changes as it blows through the atmosphere. This dynamic mixture, which is also affected by what’s burning on the ground, is tricky to measure, since each of its many components requires highly specialized equipment and expertise to assess. The equipment also has to be airborne, typically lofted into the air via planes or balloons. “There has been beautiful work on wildfire smoke,” Fischer says, “but in most studies, we just have not had all the measurements needed to really interpret things.” 
“You can’t choose not to breathe, right? You have to have clean air for everyone.”
Emily Fischer
To get a fuller view, she dreamed big: “Why not try to measure everything, and measure it systematically?” She pulled together a diverse team of 10 lead researchers, and scores more graduate students and postdocs, to pull off the most comprehensive analysis of wildfire smoke ever attempted, a project dubbed WE-CAN. During the summer of 2018, Fischer led over a dozen six-hour flights over the West, chasing wildfire smoke plumes and systematically measuring the air in and around smoke plumes with nearly 30 different instruments crammed into the cargo hold of a C-130 plane.
“[WE-CAN] is a big collaboration,” says Ronald Cohen, an atmospheric chemist at the University of California, Berkeley. He says success stemmed in large part from the team that came together.
“Making an environment for successful collaboration is really satisfying to me,” Fischer says.
While team members are still analyzing the data, the project is already revealing some of the smoke’s secrets. For example, formaldehyde and hydrogen cyanide — two chemicals linked to cancer and other health problems — are abundant in wildfire smoke. Recent wildfires show how important it is to understand the role of climate change in fires, Fischer says, and “who is most vulnerable in our society, and how we can best prepare and protect those communities.”
Fischer is also planning to adapt some of what she’s learned from WE-CAN to track ammonia emissions from farms and feed lots, which are another major source of air pollution.
Big goal
Fischer is deeply committed to bringing more undergraduate women, especially women of color, into the geosciences. And she’s using science to figure out how. She brought a team of social scientists and geoscientists together to study how different interventions can help. She and colleagues found that for every female role model a student has, her probability of continuing on in her geosciences major roughly doubles. Having someone to look up to who looks like them is key to building a sense of belonging and identity as a scientist, Fischer says. To help build that network, Fischer started PROGRESS, a workshop and mentorship program that aims to support undergraduate women in the geosciences. Started at Colorado State University in 2014, the program has since expanded, reaching over 300 women at institutions across the United States.
For her own mentees, Fischer tries to instill a willingness to take risks and go after big, bold questions. “The easy things are done,” she says. Pushing forward our understanding of pressing questions means chasing research projects that might lead nowhere, she says, or might crack open a new field of research. “It’s OK to be wrong, and it’s OK to take risks. That’s what science needs right now.”  — Jonathan Lambert
Back to SN 10 list
Taking chemistry lessons from nature
Credit: L. Brian Stauffer/UI News Bureau
Prashant Jain, 38
Physical chemist
Affiliation: University of Illinois at Urbana-Champaign
Hometown: Mumbai, India
Favorite element: Gold
Big goal
Prashant Jain explores how light interacts with matter — such as how plants use sunlight to photosynthesize — and applies that knowledge to new problems. He recently took lessons from nature to convert carbon dioxide into other useful molecules. In a paper last year in Nature Communications, Jain and Sungju Yu, also at Illinois at the time, reported using gold nanoparticles as a catalyst to drive chemical reactions between carbon dioxide and water.
When light hit the nanoparticles, it set off a series of reactions that converted carbon dioxide into hydrocarbon fuels such as methane and propane. In essence, the process not only sucked carbon dioxide — a greenhouse gas — out of the air, but it also made that carbon into fuel. No wonder the oil giant Shell is funding Jain’s work. The whole process isn’t very efficient, so Jain is working to improve how much carbon dioxide gets used and how much fuel gets produced. But along the way he hopes to learn more about how nature uses energy to make matter — and to inspire his lab to create more sustainable and renewable energy technologies.
“I am myself still a student.”
Prashant Jain
In another example of using chemistry to push toward future technologies, Jain and colleagues shined light on gold and platinum nanoparticles and triggered reactions that liberated hydrogen from ammonia molecules. Hydrogen is important in many industries — fuel cells for zero-carbon vehicles use it, for example — but it can be dangerous to transport because it’s flammable. Jain’s discovery could allow workers to transport ammonia instead, which is safer, and then free the hydrogen from the ammonia once it has arrived where it’s needed. The work was reported online in July in Angewandte Chemie.
Superpower
Jain has a remarkable ability and optimism to see unsuccessful laboratory experiments as successful steps toward understanding the natural world, says Karthish Manthiram, a chemical engineer at MIT. As a first-year graduate student at the University of California, Berkeley, Manthiram remembers being frustrated that his experiments weren’t turning out as expected. But Jain, a postdoctoral fellow in the same lab, stepped in to help and recast the problematic results. “He’s always viewed what others see as failure as moments of clarity that build up to moments when things make more sense,” Manthiram says. “For me that was an important lesson in how to be a scientist.”
Inspiration
Growing up in a family that worked mostly in business and finance, Jain fell in love with science as a preteen — inspired in part by watching the movie Jurassic Park and its fictional depiction of what might be possible through understanding the molecular world. Soon he spotted a physics textbook for sale from a street vendor and bought it. “I tried to read the book, nothing much made sense,” he says. “I wanted to be the one to figure out all these mysteries of nature.” He chose to major in chemical engineering in college (inspired in part by a magazine published by the chemical company DuPont), and then switched to physical chemistry when he moved to the United States to get a Ph.D.
Promoted this year to full professor, Jain has never stopped pushing to acquire new knowledge; when he finished teaching this last spring semester, he enrolled in an online MIT course on quantum information science. “I am myself still a student,” he says. — Alexandra Witze
Back to SN 10 list
Challenging ideas about youth suicide
Credit: Sarah Diefendorf
Anna Mueller, 40
Sociologist
Affiliation: Indiana University
Hometown: Houston, Texas
Favorite fieldwork: Observing rituals
Standout research
Between 2000 and 2015, at a high school of about 2,000 students in the town of Poplar Grove (a pseudonym), 16 former and current students died by suicide; three other similar-aged individuals in the community, mostly at private schools, also took their own lives. A clinician who had grown up in the town reached out to Anna Mueller for help breaking the cruel cycle. Before that e-mail in fall 2013, Mueller was using big data to understand why teen and young adult suicide rates in the United States were spiking. The U.S. Centers for Disease Control and Prevention estimates that suicides among 10- to 24-year-olds jumped 56 percent between 2007 and 2017.
Scholars theorized that suicidal people attracted other suicidal people. But Mueller’s work undercut that idea. In 2015 in the Journal of Health and Social Behavior, for instance, she reported that merely having a suicidal friend did not increase a teen’s suicide risk. A teen’s risk only went up with awareness that a teenage friend had made a suicide attempt. “Knowledge of the attempt matters to transforming … risk,” Mueller says. She carried an understanding of that contagion effect to Poplar Grove, where she worked with sociologist Seth Abrutyn of the University of British Columbia in Vancouver, the half of the duo who is more focused on the theoretical.
The team conducted 110 interviews and focus group meetings, lasting from 45 minutes to four hours, with Poplar Grove residents, plus some individuals outside the community for comparison. The team’s research revealed that teens felt an intense pressure to achieve in their affluent, mostly white town, where everybody seemed to know everyone else. While teens and young adults in a first wave of suicides might have had mental health problems, peers and community members often attributed those deaths to the town’s pressure cooker environment. That narrative, however incomplete, was especially strong when the youth who killed themselves were classic overachievers. Tragically, over time, that script became embedded in the local culture, making even youth who weren’t previously suicidal see suicide as a viable option (SN: 4/3/19), Mueller says.
Mueller and Abrutyn were among the first researchers to start chipping away at the underlying reasons for why suicide rates have been rising in high schoolers, particularly overachieving girls without obvious underlying mental health problems, says Bernice Pescosolido, a sociologist at Indiana University in Bloomington who helped bring Mueller into the school’s sociology department. “What Anna and Seth have really been able to show is how imitation works and what the contagion effect looks like on the ground.”
Big goal
Mueller’s long-term goal is to create a sort of litmus test that identifies schools that could be at risk of a suicide cluster. That way, school and community leaders can intervene before the first suicide and its resulting firestorm. Since fall 2018, she has been researching suicide trends in school districts in Colorado that are more diverse than Poplar Grove. When it comes to school culture, her early work shows, there’s often a trade-off between academic or athletic excellence and a supportive environment.
Top tool
In anticipation of her work in Poplar Grove, Mueller knew she needed a more boots-on-the-ground approach than her big data training allowed. So she trained in qualitative methods, including how to design a study; interview techniques, such as how to write questions to elicit desired conversations; and the detailed data analysis required for this research tactic.  
Mueller also sees the value in observing interactions, a common sociological approach. This spring, with the pandemic in full swing, she spent a lot of time on her home computer watching socially distant graduation ceremonies in her Colorado schools. She found that a school’s culture showed in the details, such as whether valedictorians addressed hot-button issues, such as the Black Lives Matter movement, in their speeches. “Of all of my moments in the field, rituals are the ones that tug at my own heartstrings because I’m watching kids graduate and that’s just inherently beautiful, but it also is a very powerful data moment,” she says. — Sujata Gupta
The National Suicide Prevention Lifeline can be reached at 1-800-273-TALK (8255).
Back to SN 10 list
The inner lives of protons and neutrons
Credit: P. Shanahan
Phiala Shanahan, 29
Theoretical physicist
Affiliation: MIT
Hometown: Adelaide, Australia
Favorite subatomic particle: The gluon
Big goal
When Phiala Shanahan was a graduate student, she was shocked to learn that experiments disagreed on the size of the proton (SN: 9/10/19). “Protons and neutrons are the key building blocks of 99 percent of the visible matter in the universe,” she says. “And we know, in some sense, surprisingly little about their internal structure.”
“If there’s something I don’t understand, I’m extremely stubborn when it comes to figuring out the answer.”
Phiala Shanahan
That ignorance inspires her studies. She aims to calculate the characteristics of protons and neutrons based on fundamental physics. That includes not just their size, but also their mass and the nature of their components — how, for example, the quarks and gluons that make them up are sprinkled around inside. Such calculations can help scientists put the standard model, the theory that governs elementary particles and their interactions, to the test.
Standout research
Shanahan is known for her prowess calculating the influence of gluons, particles that carry the strong force, which binds the proton together. For example, when gluons’ contributions are included, the proton is squeezed to a pressure greater than estimated to exist within incredibly dense neutron stars, she and a coauthor reported in Physical Review Letters in 2019. “It’s a very remarkable calculation,” says physicist Volker Burkert of the Thomas Jefferson National Accelerator Facility in Newport News, Va. “That’s very fundamental, and it’s the first time it has been done.” Because they have no electric charge, gluons tend to elude experimental measurements, and that has left the particles neglected in theoretical calculations as well. Shanahan’s gluon results should be testable at a new particle collider, the Electron-Ion Collider, planned to be built at Brookhaven National Lab in Upton, N.Y. (SN: 4/18/17).
Superpower
Persistence. “I hate not knowing something,” she says. “So if there’s something I don’t understand, I’m extremely stubborn when it comes to figuring out the answer.”
Top tool
A technique called lattice QCD is the foundation for Shanahan’s work. It’s named for quantum chromodynamics, the piece of the standard model that describes the behavior of quarks and gluons. QCD should allow scientists to predict the properties of protons and neutrons from the bottom up, but the theory is incredibly complex, making full calculations impossible to perform even on the best available supercomputers. Lattice QCD is a shortcut. It breaks up space and time into a grid on which particles reside, simplifying calculations. Shanahan is leading efforts to use machine learning to rev up lattice QCD calculations — putting her persistence to good use. “We don’t have to rely on computers getting better. We can have smarter algorithms for exploiting those computers,” she says. She hopes to speed up calculations enough that she can go beyond protons and neutrons, working her way up to the properties of atomic nuclei. — Emily Conover
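Real lattice QCD is far too heavy to show in a snippet, but the basic move — put a field on a grid and sample it — can be illustrated with a toy one-dimensional scalar field and a Metropolis update. The sketch below is not QCD and not Shanahan’s code; it only makes the “grid” idea concrete, with invented parameter values.

```python
# Rough illustration of the "put the field on a grid" idea behind lattice
# methods: a 1D free scalar field sampled with the Metropolis algorithm.
# This is nowhere near lattice QCD (no quarks, no gluons, one dimension),
# and it is not Shanahan's code.
import numpy as np

rng = np.random.default_rng(2)
n_sites, mass, step = 64, 0.5, 0.5
phi = np.zeros(n_sites)

def action(field):
    """Discretized action: neighbor-difference kinetic term plus a mass term."""
    kinetic = 0.5 * np.sum((np.roll(field, -1) - field) ** 2)
    potential = 0.5 * mass**2 * np.sum(field**2)
    return kinetic + potential

samples = []
for sweep in range(2000):
    for i in range(n_sites):
        old, s_old = phi[i], action(phi)
        phi[i] = old + rng.uniform(-step, step)
        # Metropolis step: accept with probability exp(-(S_new - S_old)),
        # otherwise put the old value back.
        if rng.random() >= np.exp(-(action(phi) - s_old)):
            phi[i] = old
    if sweep >= 200:        # crude thermalization cut
        samples.append(np.mean(phi**2))

print("lattice estimate of <phi^2>:", np.mean(samples))
```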
Back to SN 10 list
How to engineer cellular helpers
Credit: Caltech
Mikhail Shapiro, 39
Biochemical engineer
Affiliation: Caltech
Hometown: Kolomna, Russia
Favorite protein: He can’t pick just one
Bold idea
Mikhail Shapiro believes that in the future, “we’re going to have smart biological devices that are roaming our bodies, diagnosing and treating disease” — something akin to the submarine in the 1966 classic sci-fi film Fantastic Voyage. As the shrunken sub entered and repaired the body of a sick scientist, commanders on the outside helped control it. “Similarly, we’re going to want to talk to the cells that we are going to send into the body to treat cancer, or inflammation, or neurological diseases,” Shapiro says.
Shapiro and his colleagues are working on building, watching and controlling such cellular submarines in the real world. Such a deep view inside the body might offer clues to basic science questions, such as how communities of gut bacteria grow, how immune cells migrate through the body or how brains are built cell by cell.
Despite his futuristic visions, Shapiro is often drawn to the past. “I like science history a lot,” he says. Right now, he’s in the middle of rereading the Pulitzer Prize–winning The Making of the Atomic Bomb. Just before that, he read a biography of Marie Curie.
Standout research
“There is not a protein that I learn about that I don’t think about ways to misuse it,” Shapiro says. But he’s especially fond of the proteins that build the outer shell of gas vesicles in certain kinds of bacteria. These microscopic air bags “have so many uses that were totally unanticipated,” Shapiro says.
In addition to letting bacteria sink or float, these bubbles provide a communication system, Shapiro and colleagues have found. Over the last several years, they have coaxed both bacterial cells and human cells to make gas vesicles and have placed such cells within mice. Because the air-filled pockets reflect sound, the engineered cells can be tracked from outside a mouse’s body. Using patterns of sound waves, the researchers can also drive bacterial cells around in lab dishes.
“There is not a protein that I learn about that I don’t think about ways to misuse it.”
Mikhail Shapiro
In another nod to Fantastic Voyage, scientists can weaponize these cellular submarines. “We’ve essentially turned cells into suicide agents triggered by ultrasound,” Shapiro says. This explosion could release chemicals into the surroundings and destroy nearby cells. This sort of targeted detonation could be damaging to tumors, for instance. “Complete warfare is possible,” he says.
By seeing the potential in these esoteric gas vesicles, Shapiro was “ahead of his time and hugely innovative,” says Jason Lewis, a molecular imaging scientist at Memorial Sloan Kettering Cancer Center in New York City. “I think we’ve only scratched the surface of what his work will do in terms of a greater impact.”
Motivation
“Frustration,” Shapiro says, is what made him switch to engineering after studying neuroscience as an undergraduate at Brown University in Providence, R.I. He realized that existing tools for studying processes inside the brain fell short. “And I didn’t see enough people making better tools.”
But he didn’t stop at developing new neuroscience technologies. “Oddly enough, once I got into the engineering part of things, I got so fascinated with weird proteins, and magnetic fields, and sound waves, and all the more physics-y side of things. That’s become as much, if not more, of my passion as the original neuroscience.” In his Twitter bio, Shapiro describes his expertise as succinctly as possible: “Bio-Acousto-Magneto-Neuro-Chemical Engineer at Caltech.” — Laura Sanders
Back to SN 10 list
Regeneration through an engineer’s eyes
Credit: Stanford Medicine
Bo Wang, 39
Bioengineer
Affiliation: Stanford University
Hometown: Nanjing, China
Favorite organism: Planarian
Inspiration
Planarians are the most charismatic of all flatworms, Bo Wang says. “They have this childish cuteness that people just love.” But the adorable facade isn’t what drew Wang to study the deceptively simple worms, which resemble little arrows with eyes. It was planarians’ superpower: regeneration. Slice a planarian into pieces and, within a week or two, each chunk will grow into a new flatworm — head and all. Studying the cells that drive this process could offer lessons for turning on regeneration in human tissues, to treat various diseases, regrow limbs and grow organs for next-generation transplants.
Bold idea
Wang uses statistical physics to figure out how planarians regenerate entire organs cell by cell. Newly formed brain cells, for instance, must physically position themselves to avoid turning into “amorphous aggregates,” Wang says. His interest in how things fit together began in graduate school at the University of Illinois at Urbana-Champaign. There, Wang trained as a physicist and worked on self-assembling materials. Wang now works to uncover the physical rules that living cells follow. “I’m fascinated by how molecules arrange themselves seemingly randomly, but there are still statistical rules that those molecules will follow,” he says.
His physics-based approach is raising new questions and unveiling biological processes that would be hard for biologists to come by using traditional methods alone, says regeneration biologist Alejandro Sánchez Alvarado of the Stowers Institute for Medical Research in Kansas City, Mo. Wang is “a new breed” of flatworm biologist, Sánchez Alvarado says. “He is occupying a very unique niche in the community of developmental biology.”
Standout research
Wang and colleagues recently found that nerve cells, or neurons, in regenerating planarian brains form a predictable pattern dictated by the types of cells in their midst. Planarian brains are akin to cities made up of neighborhoods of neurons. Within each neighborhood, no two neurons that do the same job will live next to each other; those cells repel each other but stay close enough to communicate, the researchers reported in the May Nature Physics. Because of this behavior, increasing the types of neurons in a neighborhood limits the ways cells can pack together. The team dubbed this packing process “chromatic jamming,” after a famous mathematical puzzle called the four-color problem (SN: 3/6/09).
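The coloring analogy can be made concrete with a toy: treat cells as nodes on a grid, forbid same-type neighbors, and assign types greedily. This is not the team’s model — just a few illustrative lines, with an invented grid size, showing the kind of constraint the four-color comparison points at.

```python
# Toy take on the coloring analogy behind "chromatic jamming" (not the
# planarian team's model): treat cells as nodes of a grid graph, require
# that no two neighboring nodes share a type, and greedily assign the
# fewest types that satisfy the rule.
from itertools import count

WIDTH, HEIGHT = 6, 4

def neighbors(x, y):
    """4-connected grid neighbors, standing in for cells close enough to interact."""
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < WIDTH and 0 <= ny < HEIGHT:
            yield nx, ny

types = {}                      # (x, y) -> assigned "neuron type" index
for y in range(HEIGHT):
    for x in range(WIDTH):
        taken = {types[n] for n in neighbors(x, y) if n in types}
        # Smallest type index not already used by a neighbor.
        types[(x, y)] = next(i for i in count() if i not in taken)

n_types = max(types.values()) + 1
print(f"{WIDTH * HEIGHT} cells packed with only {n_types} types, "
      "no same-type cells adjacent")
```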
The finding is surprising and challenges “what we think we understand about organogenesis and about organization of cells within an organ,” says Sánchez Alvarado. Chromatic jamming appears to be key to how the planarian brain comes together, guiding single cells into neighborhoods that are a driving force in organ development, he says. If similar physical rules apply to human cells, that could help scientists sketch blueprints for engineering and growing artificial organs. — Cassie Martin
Back to SN 10 list
Cheaters can’t evade this genetic sleuth
Credit: Stowers Institute for Medical Research
SaraH Zanders, 37
Geneticist
Affiliation: Stowers Institute for Medical Research
Hometown: Glenwood, Iowa
Favorite organism: Fission yeast
Backstory
An invitation to work in the lab of her genetics professor Robert Malone at the University of Iowa in Iowa City set SaraH Zanders on the path to becoming a scientist. “It was a turning point in my life,” Zanders says. Before that, she didn’t really know how she would put her biology degree to use, or what it meant to be a scientist. In Malone’s lab, she fell in love with meiosis, the process by which organisms divvy up genetic information to pass on to future generations. The first step is julienning the genome and swapping pieces of chromosomes. “That just seems like such a bad idea to basically shred your [DNA] in the process of getting it from one generation to the next,” she says. She started studying the proteins involved in making the cuts. “It was like I was born to do that. I never would have known without that push.”
A different kind of push led Zanders to spell her first name with a capital H: An elementary school teacher kept leaving the letter off. Zanders has capitalized it for emphasis ever since. “If I write it without the big H, it doesn’t look like my name anymore,” she says. “It feels like somebody else.”
Standout research
Meiosis is full of conflict. For her postdoctoral work, Zanders focused on a particular type of dustup caused by some selfish genes — genes that propagate themselves even if it hurts the host. As the monk Gregor Mendel laid out in his study of pea plants, a particular version of a gene typically has a 50-50 chance of being passed on to the next generation. But the selfish genes Zanders was studying, a type called meiotic drivers because they propel themselves during meiosis, manage to get themselves inherited far more often. “These kinds of systems do a complete end run around Mendel’s laws,” says Daniel Barbash, an evolutionary geneticist at Cornell University.
In Schizosaccharomyces pombe, also called fission yeast, Zanders discovered, a family of selfish genes makes moves that would be right at home in a Game of Thrones story line. Zanders and colleagues were the first to work out the molecular tricks that these genes use to skirt Mendel’s laws, reporting the findings in eLife in 2017. The genes, known as wtf genes, produce both a poison and an antidote. All of the spores — the yeast’s gametes — get the poison, but only those that inherit certain gene versions also get an antidote. Spores that don’t get the antidote die, ensuring that only offspring with specific wtf gene versions survive to pass their genes on to the next generation. For the fission yeast, such predatory tactics can have big consequences, even driving two nearly identical strains toward becoming different species. Some selfish genes have made themselves essential for proper development (SN: 7/3/18). In humans and other animals, genetic conflicts may lead to infertility.
“This extremely important family of meiotic cheaters has been just sitting in plain sight waiting for somebody who had the right kind of lens and the care … to discover them,” says Harmit Malik, an evolutionary geneticist at the Fred Hutchinson Cancer Research Center in Seattle and Zanders’ postdoctoral mentor. Zanders helped build a case that the skewed inheritance in these yeast was a real effect, not just fluctuations in the data. Before she began her work, virtually nothing was known about meiotic drivers in yeast. Now the wtf genes are among the best known meiotic drivers studied in any lab organism. Some selfish genes in worms also use the poison-antidote trick to beat the competition (SN: 5/11/17). Meiotic drivers in fruit flies, mice — and maybe humans — win genetic conflicts by other means (SN: 10/31/17; SN: 2/24/16).
Motivation
Zanders is now on the lookout for other genetic fights in yeast. Understanding such conflicts more generally may help answer big questions in evolution, as well as shedding light on human infertility. As for what motivates her, “It’s just I like to solve puzzles,” Zanders laughs. “I wish it was a deep desire to help people, but it’s definitely not that.” — Tina Hesman Saey
Back to SN 10 list
Quake expert co-opts underground cables
Credit: Caltech
Zhongwen Zhan, 33
Seismologist
Affiliation: Caltech
Hometown: Jinzhai County, China
Favorite hobby: Carpentry
Big goal
As the Rose Parade wound through Pasadena, Calif., on January 1, 2020, Zhongwen Zhan listened to the underground echoes of the marching bands and dancers. With a sensitive technology known as distributed acoustic sensing, or DAS, Zhan tracked the parade’s progress. He even identified the most ground-shaking band. (It was the Southern University and A&M College’s Human Jukebox.)
The study was a small but elegant proof of concept, revealing how DAS is capable of mapping out and distinguishing among small seismic sources that span just a few meters: zigzagging motorcycles, the heavy press of floats on the road, the steady pace of a marching band. But Zhan seeks to use the technology for bigger-picture scientific questions, including developing early warning systems for earthquakes, studying the forces that control the slow slide of glaciers and exploring seismic signals on other worlds.
Zhan has a “crystal-clear vision” of DAS’ scientific possibilities, says Nate Lindsey, a geophysicist at Stanford University who is also part of the small community of researchers exploring the uses of DAS. “When you get such a cool new tool, you like to just apply it to everything,” he adds. But Zhan’s expertise is “very deep, and it goes into many different areas. He knows what’s important.”
So far, Zhan and other researchers have used the technology to study aftershocks following the 2019 Ridgecrest earthquakes in Southern California (SN: 7/12/19), to demonstrate that interactions between ocean waves produce tiny quakes beneath the North Sea, and to examine the structure of glaciers.
Top tool
DAS piggybacks off the millions of fiber-optic cables that run beneath the ground, ferrying data for internet service, phones and televisions (SN: 6/14/18). Not all of the glass cables are in use all of the time, and these strands of “dark fiber” can be temporarily repurposed as seismic sensors. When pulses of light are fired into the fibers’ ends, defects in the glass reflect the light back to its source. As vibrations within the Earth shift and stretch the fibers, a pulse’s travel time also shifts.
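The travel-time relationship described above can be turned into rough numbers. The sketch below assumes the simple round-trip relation t = 2nL/c and uses invented values for the fiber; real DAS interrogators work with optical phase and far more careful processing.

```python
# Back-of-the-envelope sketch of the DAS principle described above: a pulse's
# round-trip time to a scattering point is t = 2 n L / c, so a tiny change in
# travel time converts to strain (fractional stretch) of the fiber segment.
# Numbers are invented for illustration only.
C = 2.998e8          # speed of light in vacuum, m/s
N_GLASS = 1.468      # refractive index of the fiber core (typical value)
GAUGE_M = 10.0       # length of the fiber segment being interrogated, m

def strain_from_time_shift(delta_t_s: float) -> float:
    """Convert a round-trip travel-time shift (seconds) to strain dL/L."""
    delta_length = delta_t_s * C / (2.0 * N_GLASS)   # from t = 2 n L / c
    return delta_length / GAUGE_M

# A shift of 1 femtosecond on a 10 m gauge corresponds to ~1e-8 strain,
# the sort of tiny ground deformation passing seismic waves produce.
print(f"{strain_from_time_shift(1e-15):.2e} strain per femtosecond of delay")
```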
Over the last few years, scientists have begun testing the effectiveness of these dark fibers as inexpensive, dense seismic arrays — which researchers call DAS — to help monitor earthquakes and create fine-scale images of the subsurface. In these settings, Zhan notes, DAS is proving to be a very useful supplement to existing seismograph networks. But the potential is far greater. Whole networks of sensors could be deployed in places currently difficult or impossible to monitor — at the bottom of the ocean, atop Antarctic glaciers, on other planets. “Seismology is a very observation-based field, so a seismic network is a fundamental tool,” he says.
Inspiration
“I’ve been interested in science since I was young, but wasn’t sure what kind of science I wanted to do,” Zhan says. In China, students usually have to decide on a field before they go to college, he adds, but “I was fortunate.” At age 15, Zhan was admitted to a special class for younger kids within the University of Science and Technology of China in Hefei. The program allowed him to try out different research fields. A nature lover, Zhan gravitated toward the earth sciences. “Environmental science, chemistry, atmospheric science — I tried all of them.”
Then, in late 2004, a magnitude 9.1 earthquake ruptured the seafloor under the Indian Ocean, spawning deadly tsunamis (SN: 1/5/05). After hearing from a researcher studying the quake, Zhan knew he wanted to study seismology. “I was amazed by how seismologists can study very remote things by monitoring vibrations in the Earth,” Zhan says. The data “are just wiggles, complicated wiggles,” but so much info can be extracted. “And when we do it fast, it can provide a lot of benefit to society.” — Carolyn Gramling
Back to SN 10 list
from Tips By Frank https://www.sciencenews.org/article/sn-10-scientists-to-watch-2020
How Humans Make the Earth Their Home 
Beginning in 2012, and for many summers ever since, my team and I have been helicoptering onto the Greenland ice sheet, in this fantastical melt zone. We use helicopters to string cableways over the top of rushing supraglacial rivers so that we can hang this river discharge measurement technology called Acoustic Doppler Current Profiler (ADCP). We operate around the clock to collect measurements of river discharge every hour, for up to a week in duration. We have collected the world's first meltwater runoff measurements on top of the ice sheet. What we then do is simultaneously use drones and satellites to map out the upstream contributing watershed area flowing to that point where we are collecting the discharge measurements. When we know the contributing watershed area and we have the flow measurements at the bottom of the watershed, we then have a completely independent field dataset from which we can test the ability of climate models to simulate meltwater runoff from the Greenland ice sheet. And it's those models that are being used to predict the future. It's those models that are being used to estimate projected ranges of sea level rise in IPCC reports and so forth.

LAURENCE C. SMITH is the John Atwater and Diana Nelson University Professor of Environmental Studies and Professor of Earth, Environmental and Planetary Sciences at Brown University. He is the author, most recently, of Rivers of Power.

HOW HUMANS MAKE THE EARTH THEIR HOME

As an earth and climate scientist, I think a lot about the physical Earth and how it's changing, especially from climate change, but also due to human environmental impacts. As a geographer, I think a lot about how humans make the Earth their home.

If you look at the history of humanity, there's a fascinating progression of a seeming separation from nature that goes back 10,000 years and is still happening in an accelerating way today. The earliest civilizations, which got going around 10,000 years ago (their origins even earlier than that), formed along the river banks of the Tigris-Euphrates Rivers in present-day Iraq. Even before that, they formed along the Indus River in present-day Pakistan and India and the Yellow River in China. A little later, even here in North America before European contact, they formed along the banks of the Mississippi River, with a long-forgotten Native American civilization called the Cahokians. The Cahokians had earthen and log pyramids; they had a vast empire spreading up and down the Mississippi River Valley; and they had trade and administrative centers. These hydraulic civilizations along rivers created some of the first big societal inventions—the creation of the city-state, the creation of a ruling class, the creation of some of our earliest human institutions, like engineering, science, and law.

The first big step was the removal of our dependence on foraging for food, and being able to survive in fixed settlements due to irrigated agriculture. From there, we moved to trade, we discovered machines, the Industrial Revolution happened, and from there we moved to global food production. Our most recent trend, which has been going on for well over a century, is urbanization—this incredible movement of people from the countryside to urban cores. In fact, a little over a decade ago, in 2008, a remarkable threshold was surpassed with regard to this phenomenon.
Somewhere on Earth, a baby was born—we'll never know exactly where, or who it was, or exactly when it happened—and for the first time in the history of the human species, we became urban in the majority. Never before had this happened. By that time, a decade plus ago, well over half had long since forgotten how to feed themselves, how to obtain water, how to get clothing. We long ago ceded these most basic of biological requirements to our organized society. Today, in the modern world, we rely very much on the efforts of global supply chains and multinational corporations to provide our basic biological services to us, the urban consumer.I recently moved to New England from Los Angeles. In LA there's a very popular food chain called Trader Joe's, which many of you know. Trader Joe's is super popular with Angelinos like myself. It espouses fair trade principles, fair trade coffee, great labor practices, and good benefits for their employees. Many of my fellow Angelinos love to shop at Trader Joe's and gasp in horror at a Walmart, which has a different philosophy with regard to its labor practices and so on. The difference between these two companies is far less than they would appear, certainly to my neighbors. Both of them are reaching out around the planet to obtain goods and food items as cheaply as possible and route them via tightly scheduled global supply chains to us. They're doing the exact same thing in terms of the business model. And this model, which has been going since the 1970s, is but the latest chapter in humankind's ongoing redefinition of its relationship with the physical Earth.It began with our move from a hunter-gatherer foraging species over 10,000 years ago to settled communities. From there, we consolidated more and more in cities, and we invented machines to do our work for us. This accelerated the departure of people from landscapes. We went from an agricultural civilization to an urban one. In the US, less than 2% of the population is engaged in agriculture today. The fields are devoid of humans and mowed by autonomous, heavy machinery guided by GPS. The advent of trade, first locally and then globally, has redistributed the way materials are circulated around the globe. Our move to a global food model only accelerated that. Now, the trend has been very much to concentrate in urban cores. While the world population is still growing, rural areas are, depending on where you look, depopulating and losing people. Depending on what happens with the land that's left behind, this can sometimes be good for nature, with rewilding of forest and the return of species that have been extirpated there in the past.A wonderful example of this is all around me here in New England. This is a landscape that was at first dammed up everywhere—every little river and stream of any substance was damned to make mill races for the Industrial Revolution. The forests were cut down; the whole area was covered with hardscrabble farms and pasture; beavers, fish, and other forms of wildlife were extirpated. Now, in the year 2020, the place is regrown with thick forests. There are beavers all over the place. While highly suburban, most of the human activities are taking place in the city with a much milder impact upon the surrounding local environment than has happened in this area for hundreds of years.Of course, the environmental impacts of that urban activity are now global, with climate change. This is a topic I spend most of my time on for my day job. 
I'm an Arctic climate change scientist, so I do a lot of work in Greenland, Siberia, northern Alaska, and so on. Looking ahead, I wonder a lot about how this relationship between humans and the physical Earth will continue to change. I wonder if we will burn the remaining reserves of fossil fuel, which is the track we're currently on, or if we will leave it in the ground. I wonder if we will defend our coasts from sea level rise or if we will retreat from them. If we do retreat, what will be the nature of that retreat? Will it happen in an orderly way or will it take major storm surge events, like Superstorm Sandy, to render areas suddenly uneconomic or unbuildable, as happened with Hurricane Katrina along many miles of the Gulf Coast? I wonder if we will geoengineer climate, for which there are a number of proposals. Just in the last ten years, even among earth scientists like myself, this idea has gone from being an outlandish concept to one that is being taken with increasing gravitas as the national and international level efforts to slow the pace of climate change continue to fumble and falter.I wonder if humans will take control (or partial control) of evolutionary processes and begin to direct evolution through synthetic biology. My own interest in synthetic biology extends especially to the modification of plants for human food, potentially the modification of plants as a mitigator of climate change through carbon sequestration. I wonder what the implications of this may be. I am astounded to think of what kinds of life experiences my three small kids will encounter as they go through the next eighty to ninety years, particularly in this area of synthetic biology. I speculate that the changes they will see in their generation will be even more dramatic than the ones I've seen through the information revolution of the last forty or fifty years.I wonder if we will populate the Arctic. This is a place that is continuing to warm, not that it'll ever be warm in the wintertime, of course, because it's the Arctic—the polar night will always return. But it is also a place that has a great deal of fossil fuel energy that is not currently developed. Should we stay on the path of fossil fuel development, it is an area that is likely to be accessed and used for this purpose. We're already seeing that in places like northern Siberia, the North Slope of Alaska, the oil sands, or tar sands, depending on your point of view of northern Alberta, Canada. I wonder if our long, lurching trend of global integration, which has its roots in the wreckage of post-World War II, beginning with the Bretton Woods Agreement, will stop and will reverse. Certainly, developments in recent years have suggested this is possible. Without getting into the politics of it, Brexit is a prime example. There are many indications around the world that this ideology is perhaps changing and that it is quite plausible we could deglobalize moving forward.I wonder if the current Covid-19 pandemic will either deter or inspire the use of bio warfare. Bio warfare has long been the most untouchable and covert form of warfare. It's an old idea, of course, but it has always been considered untouchable by national actors. And I would hope it would continue to be. 
It is fascinating to wonder, as the Covid-19 pandemic continues—which is the biggest experiment of its kind unleashed on civilization since 1918—to what extent it could inspire bio warfare in the future, particularly along non-state actors such as terrorist groups.Finally, looking even more broadly, I wonder whether we might in future decades consider dispersing life off of Earth. And if so, what kinds of lessons might be learned from the dispersion of life on this Earth by human hands. Those are just a few of the things I've been thinking about with all my time off.For over twenty-five years now, my climate change research has drawn me north to the Arctic and Subarctic. These are the ice sheets and glaciers and permafrost lands and seas of Alaska, northern Canada, Iceland, Greenland, Siberia, the northernmost Nordic states. The reason my graduate students and I spend so much time working up there conducting field studies and satellite remote sensing is because the Arctic is one of the most rapidly transforming regions on Earth, both from amplification of global climate change as well as human pressures that are unique and operating within the region itself.Let's start with the first one. The Arctic, unlike Antarctica, experiences greatly amplified increases in climate warming that are larger than the global mean temperature increase. Whenever the planet warms up by a degree on average, that one degree average is masking some profoundly different geographical contrasts around the Earth, with the Arctic experiencing well over double the global mean average rate. The reasons for this are not mysterious. We understand them quite well. There are a number of geophysical feedbacks that operate within the region that are present in the Arctic but not in Antarctica, for example. The biggest one, and most famous of these natural feedbacks, is the geography of the region.Unlike Antarctica, which is a continent buried under over a mile of ice that is over a million years old, the geography of the Arctic consists of an ocean surrounded by massive amounts of land, with the largest landmass on the planet found in the northern high latitudes. That's where two of our biggest countries are—Russia and Canada. The fact that there is an ocean there surrounded by land, rather than a continent surrounded by ocean, gives rise to what's called the ice-albedo feedback. Meaning, the way it operates is the albedo, or reflectivity, of a dark ocean is much lower than that of ice and snow, which is highly reflective. The ocean always freezes in winter, and hopefully always will, because the polar regions will always have polar night in the wintertime. So the ice, no matter how warm it gets in the summer, will always refreeze in the winter. The Arctic Ocean freezes over in the wintertime and begins to melt back from its edges during the summertime, reaching its minimum seasonal extent in late September. The extent of that minimal summertime sea ice cover in the Arctic Ocean has been steadily declining ever since NASA first started mapping it using passive microwave satellites in the late 1970s. The maximum seasonal extent of ice cover in the Arctic Ocean has declined by over 40% since the late 1970s. This causes the reflectivity of the surface of the planet up there to decrease, because as that ice melts back, it exposes the darker ocean, which absorbs more of the incoming solar light. 
So rather than being reflected back to space, that energy is absorbed by the ocean, which warms the ocean water and retains heat within the system to be re-released throughout the year.This is one of several feedbacks operating in the Arctic, which are unique to the Arctic and cause it to experience amplified climate change. Importantly, this feedback works in the opposite direction as well. So when the Earth enters periods of global cooling, such as during Ice Age oscillations, when global temperatures decrease, the Arctic's will plunge more than double the global average. It is a highly amplified system.From a human presence and development point of view, the Arctic and Antarctica couldn't be more different. The Arctic landmasses are under the political control of eight sovereign nations. They belong to those countries, namely, the United States, Canada, Greenland, which is an autonomous region of Denmark, Iceland, Norway, Sweden, Finland, and Russia. Antarctica, in contrast, is governed by an international treaty and is shared among nations. The entry key to participation is the expenditure of funds on science and maintaining science camps in Antarctica.Antarctica is a fascinating place because many countries want to be involved with it for geopolitical reasons, but the only allowable way in which this can manifest is through the funding of science in Antarctica. As a scientist, it's absolutely wonderful. The governance situation couldn't be more different. The Arctic, being comprised of sovereign nations, is governed by the rules of those sovereign nations, with offshores governed by a very important piece of international law called the United Nations Convention on the Law of the Sea, which governs the offshore exclusive economic zones of all signatory nations (200 nautical miles). And through a provision within this piece of international law, called Article 76, signatory nations can petition to extend their offshore sovereignty through scientific mapping and geological studies if they can establish that the sea floor is a natural extension of their continental shelf.It just so happens, due to the geography of the Arctic Ocean—it's a fairly small ocean and very shallow—most, if not virtually all, of the Arctic Ocean either already has been or soon will be partitioned between five Arctic countries that abut onto the Arctic Ocean. Perhaps even the North Pole itself. The biggest beneficiaries of this are Russia, Canada, Greenland, the United States, Sweden, and Norway. When you couple that governance structure with the extensive potential reserves of natural gas, condensate, and oil in this region, the development prospects for this remote iconic part of the world are huge over the long term, not in the short term.The Arctic is a very difficult place to operate—very expensive, very environmentally fraught. Many national and multinational oil companies have attempted to operate in the Arctic and have failed or have withdrawn. But over the long term, should current trends continue as they are now (and I hope they don't), there is little question that pressure to develop the fossil fuel reserves of the Arctic and subarctic will be immense. 
In fact, it has already begun in western Siberia, where the decades-long oil and gas development of western Siberia by the Russian Federation beginning in the sixties, when it was the USSR, have now expanded North to the Yamal Peninsula, where in 2017, the Putin administration opened up a liquefied natural gas (LNG) plant at Sabetta, on the coast of the Yamal Peninsula. So, it has already begun there. And in North America, of course, we have the ongoing bumpy development of the oil sands or tar sands.The climate changes happening in the Arctic are more extreme than the climate changes even in Antarctica, which is also undergoing tremendous warming, and the development potential, pressures, and governance are totally different between the Arctic and Antarctica. This is why we see two asymmetric spheres of activity and interest taking place in these two very important and iconic regions. I will hasten to say, however, that the single greatest uncertainty in global sea level rise risk does come from Antarctica, which is why it is under acute scientific scrutiny and study—because a large fraction of that ice mass is grounded on the continent below sea level rise. It is also ringed by large, thick ice shelves that buttress glacial ice up on the land. If those ice shelves disintegrate, which they're prone to do when ocean water is warm, their ability to buttress the land ice up on top of the continent goes away. If you remove the plug at the bottom, then the land ice can flow right down to the ocean. This is quite a frightening possibility. We don't understand the processes involved very well. Our ice dynamics models are not well calibrated because we don't understand all of the tensile physics that are found along the interaction of the glacial bed with the overlying ice. The interaction of those physics with the buttressing effect and ocean temperatures at the ice edge are present in the models, but our models are not calibrated well enough to be able to know if and when this might happen. It creates a big error bar in our uncertainty about how fast and how high sea level may rise by the end of this century.This collective body of wisdom gets summarized every few years with the United Nations' IPCC reports, which are notoriously conservative and slow in their assessments. That's because they are consensus documents that have to be signed off by governments. The most recent IPCC release that came out in September has significantly upgraded their risk potential for sea level rise to be as high as anywhere from 30 centimeters to 1.1 meters by the end of this century. And again, it must be kept in mind that this is a conservative document, and the reason for such a large range comes from uncertainty surrounding the stability of the West Antarctic ice sheet.Some of the most exciting research I'm doing—together with my graduate students, postdocs, and collaborators—is improving our ability to model and predict how much sea level rise will come out of the Greenland ice sheet. The next time you look at a world map or globe, have a look at the Greenland ice sheet and look at its latitude. You should be struck by the oddity of it even existing at all. Here we have this massive ice sheet coming to quite low latitude sticking up above everything else. It's quite out of place. Unlike, say, the Antarctic ice sheet, which is perched right at the bottom of the Earth on the South Pole, the Greenland ice sheet is a relic of the last Ice Age. It survives only by virtue of its own elevation. 
What I mean by that is if you were to somehow magically pluck the Greenland ice sheet off the bedrock, it would not reform and grow. It just would not come back. The reason it persists at all is because it's already there and, therefore, at its higher elevations at the summit, the elevation is sufficiently high that the temperatures do not go above freezing during the summertime. Just like when you go up in an airplane and the temperatures get cold. There's a strong decrease in air temperature as you move up through the troposphere. This is what allows the Greenland ice sheet to exist.Now, it does melt extensively around its lower elevations along the margins. This is the area that is ground zero for Greenland's contribution of meltwater runoff to the ocean, and ground zero for the single biggest component of sea level rise contribution from Greenland to the rest of the world. A little present from Greenland, if you will. The extent and intensity of this melt zone around the edges of Greenland have been dramatically increasing pretty much every year. Quite steadily. Some years are cooler, some years are warmer, but the overall secular trend is one of increasing melt and mass transfer from the Greenland ice sheet to the ocean, where it contributes about a third of global sea level rise at this time—over a millimeter per year is coming just from the Greenland ice sheet.My team and I have been pushing this science forward by studying this melt zone. Surprisingly, until quite recently, this area received light scientific study. Most ice sheet scientists and field expeditions focus on drilling camps up at the top in the cold zone, where since it doesn't melt, the ice core record is preserved. These are amazing places to drill down into the ice and pull out a two-mile-long time machine of past climatic changes over geological history. The melt zone along the edge is a real mess. It melts every summer. Just imagine a vast slip-and-slide water park, where it snows over in wintertime, but come June, the whole place starts to melt. It's crisscrossed everywhere with torrential flowing blue streams and trickles that come together into streams, which come together into these great, branching arterial river networks flowing over the top of the Greenland ice sheet. Every single one of these torrential blue rivers in southwest Greenland, which is the most intensive area where this melting is taking place, travels for a few kilometers over the ice before breaching through the ice into a sinkhole called a moulin. From there, it tunnels under the ice and out to the ice edge where it goes right into the ocean, thus contributing to global sea level rise. This is where 90% of the runoff from the Greenland ice sheet to the ocean is coming from.Modeling this melting process and sea level rise contribution process correctly is imperative if we are to be able to forecast with reasonable accuracy the anticipated rate of sea level rise in the future. Amazingly, until the work of our group, no testing or validation of the climate models that operate for this purpose had ever been conducted for Greenland before.Beginning in 2012, and for many summers ever since, my team and I have been helicoptering onto the Greenland ice sheet, in this fantastical melt zone. We use helicopters to string cableways over the top of rushing super glacial rivers so that we can hang this river discharge measurement technology called Acoustic Doppler Current Profiler (ADCP). 
We operate around the clock to collect measurements of river discharge every hour, for up to a week in duration. We have collected the world's first meltwater runoff measurements on top of the ice sheet. What we then do is simultaneously use drones and satellites to map out the upstream contributing watershed area flowing to that point where we are collecting the discharge measurements. When we know the contributing watershed area and we have the flow measurements at the bottom of the watershed, we then have a completely independent field dataset from which we can test the ability of climate models to simulate meltwater runoff from the Greenland ice sheet. And it's those models that are being used to predict the future. It's those models that are being used to estimate projected ranges of sea level rise in IPCC reports and so forth.The fieldwork is incredible. We pitch tents on top of the melting ice. There's water running everywhere. The tents need to be re-pitched every couple of days because the ice surface is melting all around us. The entire camp takes safety precautions very seriously. If anyone were to slip into one of these streams, they would be washed away. It's about zero degrees Celsius. They are just fast flowing chutes that have melted into the ice. There are no rocks, no trees, no branches—nothing to grab onto. And as I said before, every single one of these streams and rivers flows into a moulin, which then thunders its way down to the base of the ice sheet. So if anyone were to slip and fall into one of these things, they would not be recovered. It's quite possible to work very safely in this environment, but there are many precautions we must take.The first thing we do is encircle the entire campsite with yellow caution tape, so no one is permitted to leave that small area of ice unless they are tethered to a rope. And this is a flat area; it's not mountaineering. The place is just flat and crisscrossed by streams. Nonetheless, when someone goes outside that camp perimeter, they wear mountaineering harnesses and latch into a rope, which a partner then feeds out line from. The scientist goes to the edge of the river where the cableway and the ADCP are and collects the measurements, towing the instrument back and forth across the river while someone else is making sure the rope is never long enough for them to even possibly fall into the river in the first place.We've been collecting these types of measurements for several years, and the findings are just now coming out. I'll tell you some of the preliminary results. I'm working on one of the papers right now. There's good news and bad news. The bad news is that virtually all of the meltwater generated by melting across this area of southwest Greenland, unfortunately, escapes the ice sheet to the ocean. We were hoping to find that, yes, the ice melts, but the meltwater would be somehow retained up on top of the ice sheet. Maybe it would collect in little lakes and pools and just refreeze. Maybe it would soak into the cracks of the ice and refreeze and stay there. It would be wonderful from a sea level rise point of view if the meltwater that is produced by warmer temperatures over the ice would somehow remain, or be trapped, or refrozen within the ice sheet itself. In such a case, we might expect the ice sheet to lower, but also become denser and not lose all of the mass to the ocean. 
Unfortunately, our satellite mapping and measurements confirmed that this is not the case, and that the surface of the Greenland ice sheet is amazingly efficient at losing all of the meltwater that forms on its surface through these networks of hundreds of rivers flowing over the ice. That's the bad news.The good news is that in study after study, we have shown that the current generation of climate models that are used in this area are, almost without exception, overestimating the amount of meltwater that is produced, by anywhere from 10% to as high as 50%. That's good news in the sense that some of the dire risk predictions of runoff contributions to sea level rise from Greenland are a little too high. We're digging right now into the reasons for this. One possibility is that the models are overestimating for reasons pertaining to energy balance. The other more likely reason is that some of the water is indeed being trapped within a rotten zone—rotten ice on top of the ice surface. It's a very fragmented, porous ice that appears to be storing and retaining some of the water. But finding that missing fraction of overestimation is what we're working on now. More generally, the take home message is that it is both exciting and scientifically critical to get boots on the ground to test and verify these climate models, which are becoming increasingly critical tools for us to project and to plan for the future.* * * * I had always been interested in snow, ice, and rivers, in no small part due to my dad, who is a very highly respected earth scientist. We would spend our summers in the Canadian Rockies, visiting his field sites. He was interested in the meltwater streams and rivers draining glaciers in the Canadian Rockies. In the summertime, we would pack up the station wagon with my brother, our dog, my mom and dad, and drive from downtown Chicago—where I lived a very urban, gritty downtown life—up to the Canadian Rockies, where we would live in cabins for a couple months out of the year doing scientific fieldwork of all these beautiful meltwater streams, lakes, and rivers in the Canadian Rockies.After a one-year stint with the US Geological Survey, after my master's degree, when I rediscovered that I liked school even better than the regular hours of a nine to five job, I applied to Cornell University's Earth Science Department and began a PhD there. It was at Cornell and the Earth Sciences Department where my real transformation as a scientist began.First there was the access at an institution like Cornell. The access to big minds and thinkers becomes an everyday occurrence through, for example, guest lectures and colloquia from eminent visiting scientists. My second year at Cornell, I had the privilege to meet Richard Alley, a world-famous glaciologist at Penn State University who came to give a talk at Cornell and inspired me. He was speaking about his latest work, which had just come out in Nature. He would use ice cores from the Greenland ice sheet to show how North Atlantic climate, at the end of the Younger Dryas cold snap, which was a brief relapse into Ice Age temperatures that happened just as the world was pulling out of the last Ice Age—it was warming up and, almost instantly, the ice core record showed that the temperatures plunged back into cold Ice Age temperatures, stayed cold for while, then warmed, then cooled, then jumped out and continued on its warming trend. 
And so the temperature swings were dramatic, on the order of several degrees in a couple of years, perhaps even within a single year. It was absolutely astounding work. Richard Alley was the lead author of that work and a very famous person.He came to Cornell and gave a lecture on it. And then, as is customary with these types of academic visits, there was a sign-up sheet to meet with him. Even lowly graduate students such as myself were permitted to spend thirty minutes with him, talking about whatever. So I nabbed one of those spots with Richard and was very much looking forward to hearing him talk about himself, and his discovery, and how he got there. To my amazement, he didn't want to talk about any of that. He insisted upon talking about me and my projects and what was I working on. At that moment, I was working on something which was very hot called wavelet transforms. It's a mathematical filter for time series analysis. I was almost embarrassed to talk to him about it, but he insisted, and before I knew it, we had run over time. We just had this very stimulating, exciting conversation.What I learned from that encounter was how many of our greatest scientists are just perpetually curious. It doesn't matter who they speak to. They're always hungry to learn, always excited to hear new ideas, and interested in engaging and discussing them more than talking about their accomplishments, which was just fantastic.The other formative experience I had at Cornell was running into the culture of that Earth Science Department, which at that time was populated by some towering giants in the field of tectonics, including my own advisor, Bryan Isacks. Bryan and his advisor, Jack Oliver, at Lamont-Doherty Earth Observatory, discovered plate tectonics in the 1960s. When I say "discovered" plate tectonics, what I mean is that Bryan, then a graduate student, and his advisor assembled the datasets and finally pulled together the pieces of the puzzle that revealed the existence of plate tectonics on Earth. The main datasets they looked at were magnetic reversal stripes, which have matching sets on either side of spreading ridges in the ocean and the epicenters of earthquakes.The story, as I heard it told, was that Bryan Isacks and his advisor were standing in the advisor's office at Lamont-Doherty, sketching on a blackboard with little Xs the locations of the epicenters of earthquakes near what we now know to be a subduction zone, where one piece of tectonic crust is plunging beneath another. As Bryan and Jack were puzzling and musing over these, Bryan suddenly blurted out, "It's going down. The crosses are going down. It's [meaning the tectonic plate] getting dragged beneath the other." At that moment, it all finally became clear that the Earth's crust was being born at the spreading center ridges in the oceans, and the magnetic reversals were capturing that slow growth over time. And then they were being subsumed and plunging on the other end in the death zone where one plate was subducting beneath another. They arrived at this idea mutually and flipped a coin over who would be lead author on the paper. 
Bryan won and he got to be lead author on the paper that put together for the first time the theory of plate tectonics, which launched a revolution in the earth sciences that is still in the mop-up phase today.The benefit to me from working at Cornell was that because of this legacy with Bryan Isacks and Jack Oliver, who moved to Cornell University to the Earth Science Department, they brought with them a whole cohort of big thinkers on plate tectonics who, for the first time, were looking at the Earth at the broadest possible scale to get a big-picture understanding of the Earth system. By its very nature, plate tectonics demands that kind of perspective.When I arrived there in the 1990s, I brought with me a very small field-oriented mindset. I wanted to study a small watershed and understand all these detailed processes, and my advisor and committee pretty much beat me up over that and said, "This is Cornell University—we think big. You have to broaden yourself out." So that was the beginning of my transformation from thinking very small about the earth sciences to very broad scale theoretical and observational study of the Earth at the largest possible scales. This is one reason why my research spans everything from broad Arctic change, to societies, and to humans and the way we make the Earth our home.The other very influential person on my career was a colleague at UCLA in the Geography Department named Jared Diamond. Jared Diamond was enormously influential on me.My first or second year at UCLA as an assistant professor in the Geography Department, I was put in charge of the moribund colloquium program, which had few speakers and no one would attend the talks. I was told I had to revive it and that there was no money to bring in speakers. I had a budget of zero dollars. I said, all right, UCLA is a pretty good school, and there are some good speakers here. Perhaps we could invite them to the Geography Department to speak to us. I had heard of Louis Ignarro, who had just won a Nobel Prize in medicine, and I had heard of Jared Diamond, who at that time was in the medical school at UCLA. So I rang up both of their offices. A few days later, the administrative assistant for Louis Ignarro called me back and told me, yes, Dr. Ignarro would be delighted to speak to the Geography Department, for a $10,000 speaking fee. Around the same time, Jared Diamond called me back personally and said he would be delighted to speak to the Geography Department, so we scheduled him and a few weeks or months later he came and gave a talk based on Guns, Germs, and Steel, which had either just come out or was just about to come out. Many of the ideas in that book are highly geographical. The talk was great. The faculty hung around for what is probably the longest Q and A session I have ever witnessed—it went for about forty-five minutes. There was a fantastic back-and-forth between Jared and my colleagues.A few months later, we got a query from Jared about the possibility of transferring to the Geography Department at UCLA. I was not part of those discussions, but I like to take some credit for enticing Jared to geography at UCLA and leaving the medical school. Jared left me alone until I got tenure and I established my core research working with field studies and remote sensing in west Siberia, Iceland, Alaska. After I received tenure, he really encouraged me to think a little more broadly, even beyond the hard earth sciences. 
I had read Jared's books and was very inspired and excited by his synthetic way of thinking. I value the way he looks across different disciplines and tries to find common threads and pull them together. He does not hesitate to learn from other fields even if they're a bit far from his core academic research. That has inspired some of my own efforts and writing outside of my core science. Those two individuals were extremely influential on me along this academic and intellectual career that I'm still on. I look forward to meeting the next person.
allanamayer · 4 years
Text
Surveying a community on the idea of a digital archive
The last thing I want to share from this Oakville Arts Council project is the survey data we collected near the end of the study. We opened this survey up to the wider community so not every respondent had been fully briefed or consulted about their own individual participation - and while some responses reflect this, the yes/no questions are still very much in favour of the project. I think these kinds of responses are fully replicable in other areas, whether or not you do as much extensive consultation as we did (we met with 50+ individuals representing 20+ community groups) or have other outreach and education components such as the webinars we did.
Here’s the text straight from the study report, and I’ll attach the survey instrument too so you can see what we asked. Here it is in a PDF.
***
Of our final dataset, 97.73% were supportive of the project, 100% felt the Oakville Arts Council was organizationally capable of project success, and more than 20 respondents gave us ideas about how they would contribute to the archive or use it once it was built. 
97.73% would like to see more documentation of Oakville’s art history.
Responses include:
“I think we have a very vibrant and active artistic community and it definitely should be documented.”
“There is a rich history and also many newcomers to town who may not little of that history.”
“Many people have no idea of the wide variety of arts in which they can learn or participate even when they have lived here for several years. Newcomers of course have even less idea what is available.”
“Oakville has a rich history which will be lost if it is not properly documented.”
“No historical account of a community would be complete without the history of the arts in the community as the arts paint a rich picture of the culture and the soul of a community.”
“It is so important for new generations to connect with the history of their city, and how it relates to their community.”
“It is good to have a reliable site to research arts and artists - knowing more about both gives a better understanding of the work and can be encouraging to other artists to know more about an artist's successes and hardships overcome.”
“History is critical to helping current and future Oakville residents feel a greater connection to, and affiliation with, their own art, with Oakville, and with the community.”
97.73% would like to see an online archive of materials related to Oakville's art history.
Responses indicated an interest in both an online collection and a physical collection:
“It is good to have such an archive in one place - online research is not always easy and if there is a specific Oakville art site, then it makes research or just looking for information, more accessible.”
“Both online and in print should be promoted. Younger people are accustomed to finding about everything online and some older generations prefer finding information in print.”
“Accessible at all hours and to all. However, I also am concerned about the difficulty of keeping it up-to-date as digital technology evolves.”
“Not only should there be an online archive but original materials should be collected and properly stored for future researchers.”
100% think the Oakville Arts Council is the right organization to pursue this project.
Responses include:
“They are a trusted supporter of the arts locally.”
“Who better?”
“It is the one voice for the arts in Oakville.”
“They have the staff and facility to organize and amass the large amount of information which will need to be categorized and presented.”
“Experience and the personnel to complete it.”
“The OAC has access to and a relationship with a variety of artists and therefore is well placed to access the information.”
“I look to them for general information. They already demonstrate skill online and have a recognized presence for the arts.”
“Projects need a leader, but this needs to be a joint effort involving as many stakeholders as necessary to ensure the final product covers all the bases.”
95.45% agree that this project will help more people connect with culture, heritage, and the arts.
Responses include:
“Digital dispersal of information is practical and appealing to most of the population. It is an excellent method of publicizing upcoming arts events and also retaining the history of past performances of all kinds, usually only briefly covered in the local press. It will require great diligence to keep it going properly.”
“During physical distancing, residents can still engage with the arts.”
“If presented in a concise, exciting way I think it would be a wonderful interesting resource.”
“It would be an important reference source for future artistic projects to be aware of what has been done before, and the Oakville people who initiated and sustained artistic endeavours here in Oakville.”
“It will allow those interested in the various arts to find out who have gone before and the heritage of this town and its culture and be a link between the past, present and future.”
97.73% agree that this project will benefit artists and arts organizations; 97.73% agree that it will benefit the community of Oakville.
Responses include:
“It may give newcomers a point of connection not only with the community as a whole but also lead people to the arts organization they might be interested in exploring.”
“Could enhance collaborative efforts.”
“Information is vital for getting new members involved.”
“We learn valuable lessons from those who have a history in the arts.”
“Can provide inspiration and a forum for collaboration.”
“More involvement in community events, potentially more volunteers and donors. Possibility of children getting interested in the history of their particular arts form here in town.”
“It may help those who are thinking of moving to Oakville an insight of what we feel is important to us and the preservation of our cultural history.”
“There is such an influx of new residents, it is important for them to know and recognize the people who have helped to created a vibrant society in the past.”
“It will bring people together socially.”
“Curated materials add to content that can be posted online, used in the classroom, whether online or not, and add to our stories.”
95.45% agree that this project is needed now.
Responses point to the coronavirus pandemic as a specific timely reason:
“As time marches on resources that are hidden away in boxes and basements will disappear forever.”
“The arts are needed more than ever during Covid-19.”
“Pertinent documents may be lost if not gathered soon. I am over 80, and if I can no longer be active, my reference material will be discarded.”
“Much of the past is disappearing and needs to be archived before it is too late.”
“If we do not have something like this already, then why not now? The longer we wait the more gets lost as people of previous generations with valuable input and information may no longer be around.”
“This will take time and the sooner you start the better. There are many who would say we will need all our money and help to rebuild the economy for those who have lost work and businesses and this is true, but we also need to keep in mind those things that enrich our souls, our inner spirits and give us joy - the kind of joy we can get from a fine painting or a beautiful song or performance by someone who has been gifted and nurtured here in our own Town.”
“In terms of self isolation - provides more connection for the community (this would be a very immediate need).”
“This question couldn’t come at a better time - we’re housebound, and having curated materials posted online allow for learning experiences we can’t otherwise have.”
“You don’t want history to fall through the cracks.”
92.86% of eligible respondents would be willing to participate in some way.
Responses include organizations we have already met with, as well as individuals:
“I have much of the above material which needs to be saved.”
“Because I was born here and have been involved with the artists of Oakville for almost a century.”
“Yes, if it moves forward I would think that I/we would participate, as we participate in any opportunity that may give us some visibility, however small.”
“As I continue to make art, teach art and write about art, I hope to contribute what I have been privileged to learn. Being born here 91 years ago and I can still remember, there are stories that I can contribute.”
“I’m not sure how, but I’d love to.”
“I’m always available to help the arts.”
2 responses indicated they felt they were ineligible or had nothing to contribute, but as the project components are still undetermined, there may be broader or more strict eligibility in the future.
We asked people to tell us in their own words how they might be able to contribute:
“I would provide audio and video recordings, and historical documents, about the groups I work with.”
“[Our group] has existed in this town for almost 60 years and we have material that could be digitized covering most of those years.”
“I have items of historical importance for [a group].”
“Photographs, newspaper articles, audio and video recordings”
“I would submit paintings and animation, and reference my writing and books.”
“submit videos, student compositions.”
“I’d happily work/volunteer/promote the collection. I have memories of taking pottery courses in the little building at Coronation Park as a child and have the horrid creations still. The experience meant something to me.”
“I would like to contribute memories and stories. I was part of community theatre for many years.”
We then asked people to tell us in their own words how they would use the archive:
“As a history buff, I would consult it to find out about artists of the past.”
“Referring to it for information on past and present artists and their works and new ideas of what constitutes art.”
“Excellent resource for developing community projects and bonds.”
“To connect with other arts organizations.”
“I could refer new chorus and orchestra members and future concert goers to the material in the archive.”
“Share it with audience members”
“Promotion of Oakville”
“I would refer to it in marketing and correspondence, for artists I am involved with.”
“If allowed, to continue providing updated images.”
“I’d personally love to read others’ stories, use them for research when writing, and to teach.”
“It would be interesting to see the history of such things as the Joshua Creek Art Centre, and the OAC itself.”
“sharing it with new members of our various organizations, providing a link to it on our website and social media.”
“I would refer to it in our literature and on our web sites to draw people's attention to this resource.”
We asked respondents to rank various potential components of the project. In order of preference, 50% or more of respondents were interested in:
Profiles of Oakville-related artists and arts organizations (84.85%)
Virtual exhibits exploring aspects of Oakville's art history (81.82%)
Oral history interviews with Oakville artists and patrons (78.79%)
Video tours of Oakville cultural venues, landmarks, or studios (69.70%)
Success stories about Oakvillians (including students) that made it big (69.70%)
Educational resources for classroom use (60.61%)
Audience-submitted stories or memories in text, audio, or video formats (57.58%)
Audience-submitted materials, such as event or exhibit photographs (54.55%)
Interactive map of Oakville cultural locations over time (51.52%)
Historical magazines, periodicals, and newspaper articles (51.52%)
Finally, we asked people to share their thoughts on any of the topics we’d covered, or the project in general:
“I think this project is an excellent idea and please feel free to call on me at any time to participate.”
“It would be a good idea to emphasize newcomers and diversity - showing us to be a welcoming community.”
“The concept of arts organizations should be interpreted broadly to include educational organizations that deal with digital arts, for example, and not be restricted to graphic art such as painting. The administration of arts organizations should be included.”
“Would like to see profiles of Oakville based artists who are internationally recognized as I am not aware of many.”
“Your project is a great idea. Now, bring it to life.”
“Focus first on gathering the available information together before embarking on expensive or grandiose projects like documentaries and professional editing.”
ayushi-blog · 5 years
Photo
Heart Attack Prediction using IoT and machine learning
Many people lose their lives to heart attacks because they do not receive medical attention at the right stage. In this project we therefore implement a heart rate monitoring and heart attack detection system using IoT. The patient carries hardware fitted with sensors, paired with an Android application. The heartbeat sensor monitors pulse readings and transmits them over the internet. The user can set upper and lower heart rate limits; once these limits are set, the system monitors the patient's pulse, and whenever the readings go above or below the configured limits it sends an alert about the high or low heart rate and the possible risk of a heart attack.

Heart disease is now the leading cause of death worldwide. Predicting a heart attack is a complex task even for a medical specialist, since it requires considerable experience and knowledge. Heart rate is nevertheless one of the most important measurements influencing heart attack risk, together with other health indicators such as blood pressure, serum cholesterol and blood glucose level. With the rapid growth of the Internet of Things (IoT), sensors for monitoring heart rate are becoming increasingly available to patients. In this project I describe an architecture for monitoring blood pressure and other data, and explain how a machine learning method such as the kNN classification algorithm can be used to predict heart attacks from the collected pulse data and other health-related parameters.
The heart is a vital organ, and its proper functioning is essential for the other organs of the body, such as the kidneys and the brain. For a long and healthy life, proper care of and alertness to these diseases is therefore essential. The first question that comes to mind is: what is the easiest and fastest way to achieve this? The usual answer is regular check-ups and a proper diet, but this alone is not sufficient for care and preparedness.

Because cardiovascular diseases are among the fastest-growing illnesses in the modern world, we should also draw on the procedures and techniques used for preparedness and care. For this purpose we develop a computer-based decision support information system that facilitates correct diagnosis at reduced cost. Integrating an existing medical decision support system with data mining requires evaluating several mining techniques for extracting the knowledge relevant to this task.
Here we built a module that predicts the probability of heart disease risk using data mining techniques together with a Wireless Sensor Network (WSN). First of all, we gather patient records from the UCI database and extract the essential attributes required for prediction.
As the project was carried out by a single member (me), I followed the Software Development Life Cycle. The first step was requirement analysis: I read research papers on the problem and its existing solutions, and studied what knowledge and components would be required. The next step was design, working out how the model and the device would be built with the chosen sensors. Implementation followed, where everything was put together into the working product. Testing then took the collected attributes and fed them into the model built in the previous step. Training was done on the data available in the UCI dataset and testing was done with a real-time dataset, and possible directions for future work on this problem were identified. In short, the standard software development lifecycle was used so that each process could be completed one by one, following the waterfall model, in which development flows step by step through the phases of analysis, design, implementation, testing, deployment and support, with each stage documented and its expected deliverables predefined.
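To make the training and testing step above concrete, here is a minimal sketch of a kNN classifier built on the UCI heart disease attributes with scikit-learn. It is an illustration under stated assumptions, not the project's actual code: the file name heart.csv, the column name target and the choice of k = 5 are placeholders I introduced, and a real-time sensor reading would need the same preprocessing before prediction.

```python
# Minimal sketch (assumptions: heart.csv is a local export of the UCI heart
# disease data with a binary "target" label column; k=5 is an arbitrary choice).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Load the UCI records and separate the attributes from the label.
data = pd.read_csv("heart.csv")
X = data.drop(columns=["target"])
y = data["target"]

# Hold out part of the data to check the classifier before using live readings.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# kNN is distance-based, so the attributes are scaled first.
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

model = KNeighborsClassifier(n_neighbors=5)
model.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))

# A new real-time reading (sensor values plus stored patient attributes) would
# be scaled with the same scaler and passed to model.predict().
```

In practice the value of k and the feature scaling would be tuned on the UCI data before any live sensor readings are classified.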
The heartbeat sensor is designed to give a digital output of the heartbeat when a finger is placed on it. While the detector is working, the beat LED flashes in unison with each heartbeat, and the digital output can be connected directly to a processor to measure beats per minute (BPM). The basic aim of the proposed system is to provide affordable, connected health services to patients by implementing a networked data cloud, so that doctors and specialists can make use of this information and give patients a quick and efficient response. The final model is equipped so that a doctor can examine the patient from anywhere at any time. An IoT-based e-healthcare model of this kind acts as an economical aid to sick people, helps reduce queues in hospitals, enables direct consultation with doctors and reduces dependence on caregivers. It can benefit all categories of people who gain access to the device through a pre-registration procedure: patients can use it regularly to measure blood pressure, glucose level, heart rate and temperature, and catch problems before they reach a serious stage. It also provides advanced patient profile management with secure encoding and cryptography, so it can serve as a general-purpose device supporting multiple users. Such a system would be a cost-effective way to address health management in government hospitals, since it is delivered entirely through a web application.
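As described above, the system lets the user set high and low heart rate limits and raises an alert when a reading crosses them. The loop below is a hedged sketch of that logic only; read_bpm() and send_alert() are hypothetical placeholders standing in for the actual sensor read and the notification to the Android application, and the limit values are arbitrary example settings.

```python
# Sketch of the user-configurable threshold alert described above.
# read_bpm() and send_alert() are hypothetical placeholders, not part of any
# real device API: read_bpm() stands in for the heartbeat sensor read and
# send_alert() for the notification pushed over the internet to the app.
import random
import time

LOW_LIMIT = 60    # user-set lower heart rate limit (assumed example value)
HIGH_LIMIT = 100  # user-set upper heart rate limit (assumed example value)

def read_bpm():
    # Stand-in for reading beats per minute from the heartbeat sensor.
    return random.randint(50, 120)

def send_alert(message):
    # Stand-in for sending the alert to the Android application.
    print("ALERT:", message)

def monitor(interval_seconds=5, readings=10):
    # Check the pulse at a fixed interval and alert when it leaves the limits.
    for _ in range(readings):
        bpm = read_bpm()
        if bpm > HIGH_LIMIT:
            send_alert(f"High heart rate: {bpm} BPM, possible heart attack risk")
        elif bpm < LOW_LIMIT:
            send_alert(f"Low heart rate: {bpm} BPM, possible heart attack risk")
        time.sleep(interval_seconds)

if __name__ == "__main__":
    monitor()
```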
amazingreligion · 5 years
Text
Human Evolution and Islam
N.B. This is not a new topic! It is part of an archived thread originally posted on sunniforum.

This is a thread for this topic and how Islamically we can explain it. It is only for Muslims, whilst Muslim biologists are specially invited to participate. It is incumbent upon the Muslim biologists and those in the field to refute those who use biological ideas against Islam. Otherwise don't blame those Muslims who are not biologists yet who take on this task.

Here's a start that I wrote elsewhere:

Ok, after having read a lot since that time, particularly from pro-evolutionists, I'm not convinced by human evolution. On the topic of other species, it is highly probable that macro-evolution (in case some are wondering, some evolutionists do accept the distinction) to some extent took place, although current evolutionary theory is seriously deficient in its explanations and must be improved. I've come up with several observations:

1) Evolutionists at times have used outright deception to propagate their views (such as Haeckel's embryos).
2) Evolutionists have been very gullible in accepting "evidences" that have turned out to be wrong (the "Piltdown man," the "peppered moths", the Miller-Urey experiments etc.).
3) Evolutionists have a lot of evangelicals (atheists and agnostics) who are trying to promote their views against theists and thus have no sense of accountability for what they say - provided they're not shown to be wrong - and this has been shown by their blatant lies at times.
4) Christians opposing evolution have also used lies and shown ignorance of the topic.
5) Thus neither side has been trustworthy (especially the atheists).
6) Considering the methodology of Islam in accepting information - more important in this case as the issue affects Islam - I propose that Muslim scientists don't accept information on the topic where the opinion is against Islam (especially when there's a strong bias by the kuffar) but should set stringent criteria for verifying the information and must verify it themselves.
7) The conclusion is that the information that the kuffar provide on the topic of evolution (where it contradicts Islam, i.e. human evolution) is not accepted at the current moment until it is verified by trustworthy Muslim scientists.
8) Also, Muslim scientists should aim to refute those kuffar who oppose Islamic beliefs through science.

Harun Yahya (although not a scientist, and I disagree with his works) must however be commended (contrary to the useless Muslim scientists who have done nothing but complain about him), as he has set the groundwork, and now the real Muslim scientists should take over and modify and strengthen his arguments.

Note that I've talked with a science teacher (who is a biologist and has a masters degree) and he basically said that there are a number of serious problems with the theory, but it's a developing theory so there's no guarantee on some of the things it says. And he rejected human evolution.
Last edited by loveProphet; 25-06-2012 at 09:26 PM. A thought on human evolution, one thing that we expect if we're created differently to the rest of creation is that we should have unique things. Of course it is obvious in our intelligence(it is the highest) and other behavioural aspects but lets look at the physical aspects. Also why did humans supposedly ditch the trees and the tail? Before it was suggested that being bipedal involved less energy but now its shown to not be the case. Anyways some stuff i picked up from page 7 and onwards: http://www.arn.org/docs/luskin/cl_fa...gentdesign.pdf Sure i'm not fond of ID but they've got neat references that can be checked up. Another study wrote, “We, like many others, interpret the anatomical evidence to show that early H[omo] sapiens was significantly and dramatically different from earlier and penecontemporary australopithecines in virtually every element of its skeleton and every remnant of its behavior.” J. Hawks, K. Hunley, L. Sang-Hee, and M. Wolpoff, “Population Bottlenecks and Pleistocene Evolution,” Journal of Molecular Biology and Evolution, Vol. 17(1): 2-22 (2000). One commentator proposed this evidence implies a "big bang theory" of human evolution. New study suggests big bang theory of human evolution The famed late evolutionary paleontologist Stephen Jay Gould noted that "most hominid fossils, even though they serve as a basis for endless speculation and elaborate storytelling, are fragments of jaws and s****s of skull" A Harvard evolutionary paleoanthropologist recently stated in the New York Times that newly discovered hominid fossils "show 'just how interesting and complex the human genus was and how poorly we understand the transition from being something much more apelike to something more humanlike.'" Fossils in Kenya Challenge Linear Evolution [ "Other paleontologists and experts in human evolution said the discovery strongly suggested that the early transition from more apelike to more humanlike ancestors was still poorly understood. " And see: Fossil find pushes human-ape split back millions of years "we know nothing about how the human line actually emerged from apes.” Ok so i went through sciencedaily.com some time ago to see what features are unique to humans apart from the soul. I've found out about the brain and humans walking but now i saw this: What Is The Cognitive Rift Between Humans And Other Animals? No Easy Answers In Evolution Of Human Language Complexity Constrains Evolution Of Human Brain Genes Now fit this in with the Islamic idea of man being created differently. On the other hand more on the ERVs: Ancient Retroviruses Spurred Evolution Of Gene Regulatory Networks In Humans And Other Primates Using the tools of computational genomics, the UCSC team gathered compelling evidence that retroviruses helped out. It can be used as an argument that Allah put them there for our benefit. More like a common plan is why you see them at the same loci on the same chromosomes in the different species. Also see for HERVs: Retroviruses Shows That Human-Specific Variety Developed When Humans, Chimps Diverged More: Do orthologous gene phylogenies really support tree-thinking? Results Heat map analyses were used to investigate the congruence of orthologues in four datasets (archaeal, bacterial, eukaryotic and alpha-proteobacterial). We conclude that we simply cannot determine if a large portion of the genes have a common history. In addition, none of these datasets can be considered free of lateral gene transfer. 
Conclusion Our phylogenetic analyses do not support tree-thinking. These results have important conceptual and practical implications. We argue that representations other than a tree should be investigated in this case because a non-critical concatenation of markers could be highly misleading Originally Posted by ahsanirfan as salam `alaykum I shall respond, but not now. jazak Allahi khayrun for alerting me to it. Insha allah, keep adding whatever you know and I will be sure to read up on it. I took out three books today from the library: Behe, Michael - Darwin's Black Box - I've read this before, but I plan to read it again. Behe, Michael - The Edge of Evolution Gould, Stephen Jay - Punctuated Equilibrium - This is about how there are gaps in the fossil record Let me know if you have more resources that I can look up, insha Allah. I will Insha'Allah write up more when i have time. But Jay Gould's book is definitely great, he started the movement against gradualism(its weak in palaeontology) although he was an atheist. Jeffrey Schwartz has taken the lead, nevertheless they still believe in evolution(and i have no problem with it except with human evolution). As for Michael Behe and a lot of the IDists, they support human evolution so this stuff is of no use to us on this issue. So don't waste your time reading those two although i have the new book by him(got it today from the library called biochemical challenge). What we have to do is really create an Islamic perspective of this. For this the first thing we need is the different Islamic material(Qur'an, Hadith etc) on the creation of Adam(AS) and then we can make logical predictions from them so that we can atleast know what to look for. Though i might read this: http://www.amazon.co.uk/Why-Fly-Hors...3647641&sr=8-1 Since however you're not going to be studying biology, you might also want to read this(its simple for laymen to understand): http://www.amazon.co.uk/Evolution-Du...3647683&sr=1-1 There is a discussion over punctuated equilibrium and gradualism in this peer-reviewed article: http://www.discovery.org/articleFiles/PDFs/Cambrian.pdf This is only because you mentioned Gould's book. I don't the evolution of non-humans to be a discussion here. Remember we're not after evolution of non-humans so don't get too distracted! Last edited by loveProphet; 16-06-2008 at 08:28 PM. Discussions with Christians and Debate Human evolution is not supported by the fossil evidence. Much of the alleged evidence that filled text books over the last 50 years has now been reclassified or rejected altogether. The missing links are still missing. Human Evolution: The Legacy of the Fossil Evidence Human evolution has many issues, including the realities of genetics, biochemistry, design theory, irreducible complexity, DNA structure, and information systems. However, the reality of the human fossil record alone is enough to reject the theory of human evolution all together. Here are just a few of the major problems with the alleged fossil record of the past century: Ramapithecus was widely recognized as a direct ancestor of humans. It is now established that he was merely an extinct type of orangutan. Piltdown man was hyped as the missing link in publications for over 40 years. He was a fraud based on a human skull cap and an orangutan's jaw. Nebraska man was a fraud based on a single tooth of a rare type of pig. Java man was based on sketchy evidence of a femur, skull cap and three teeth found within a wide area over a one year period. 
It turns out the bones were found in an area of human remains, and now the femur is considered human and the skull cap from a large ape. Neandertal man was traditionally depicted as a stooped ape-man. It is now accepted that the alleged posture was due to disease and that Neandertal is just a variation of the human kind. Human Evolution:  Human evolution has its currently fashionable specimens that lead from small ape-like creatures to Homo sapiens. These are examples of the most recent alleged links: Australopithecus afarensis, or "Lucy," has been considered a missing link for years. However, studies of the inner ear, skulls and bones have shown that she was merely a pygmy chimpanzee that walked a bit more upright than some other apes. She was not on her way to becoming human. Homo erectus has been found throughout the world. He is smaller than the average human of today, with a proportionately smaller head and brain cavity. However, the brain size is within the range of people today and studies of the middle ear have shown that he was just like current Homo sapiens. Remains are found throughout the world in the same proximity to remains of ordinary humans, suggesting coexistence. Australopithecus africanus and Peking man were presented as ape-men missing links for years, but are now both considered Homo erectus. Homo habilis is now generally considered to be comprised of pieces of various other types of creatures, such as Australopithecus and Homo erectus, and is not generally viewed as a valid classification. Human Evolution: The Most Recent Find In July 2002, anthropologists announced the discovery of a skull in Chad with "an unusual mixture of primitive and humanlike features." The find was dubbed "Toumai" (the name give to children in Chad born close to the dry season) and was immediately hailed as "the earliest member of the human family found so far." By October 2002, a number of scientists went on record to criticize the premature claim -- declaring that the discovery is merely the fossil of an ape. Human Evolution: The Theory Has No Support in the Fossil Record Human evolution is a theory in denial. With all of this fossil evidence (or lack thereof) it becomes increasingly clear to an earnest seeker that human evolution did not happen at all. • Lack of Transitional Fossils. Charles Darwin wrote, "Lastly, looking not to any one time, but to all time, if my theory be true, numberless intermediate varieties, linking closely together all the species of the same group, must assuredly have existed. But, as by this theory, innumerable transitional forms must have existed, why do we not find them embedded in countless numbers in the crust of the earth?" (Origin of Species, 1859). Since Darwin put forth his theory, scientists have sought fossil evidence indicating past organic transitions. Nearly 150 years later, there has been no evidence of transition found thus far in the fossil record. • Lack of a Natural Mechanism. Charles Darwin, in his Origin of Species, proposed Natural Selection to be the mechanism by which an original simple-celled organism could have evolved gradually into all species observed today, both plant and animal. Darwin defines evolution as "descent with modification." However, Natural Selection is known to be a conservative process, not a means of developing complexity from simplicity. 
Later, with our increased understanding of genetics, it was thought perhaps Natural Selection in conjunction with genetic mutation allowed for the development of all species from a common ancestor. However, this is theoretical and controversial, since "beneficial" mutations have yet to be observed. In fact, scientists have only observed harmful, "downward" mutations thus far. N. Heribert Nilsson, a famous botanist, evolutionist and professor at Lund University in Sweden, continues: My attempts to demonstrate evolution by an experiment carried on for more than 40 years have completely failed… The fossil material is now so complete that it has been possible to construct new classes, and the lack of transitional series cannot be explained as being due to scarcity of material. The deficiencies are real, they will never be filled. 4 Even the popular press is catching on. This is from an article in Newsweek magazine: The missing link between man and apes, whose absence has comforted religious fundamentalists since the days of Darwin, is merely the most glamorous of a whole hierarchy of phantom creatures … The more scientists have searched for the transitional forms that lie between species, the more they have been frustrated. Is it enough to prove that the human evolution is not possible? As I have already mentioned that in Quraan it is cleared stated that: All human are created from the single pair (ie. Adam and Hawwa) And still today the science is not advance to prove this.  So Quraan is superior to the science. Realistically, if Darwin's theory can't begin to explain the 'evolution' of a system as simple as a ten part mouse trap, what hope has it got in explaining the development of the complex biochemistry associated with a single cell organism, let alone higher life forms? The Test Commandment: Sabbath matter Now examine the account in Exodus 16:1-30. The people of Israel were "murmuring" against God because they wanted more food. So God said, "I will... TEST them, whether they will walk in My LAW or not" (v. 4) Remember that this was a TEST—to see whether they would follow God's law or not. So what did the people do?     As human beings so often do, they did NOT take God seriously! Some Israelites went out and tried to find manna even on the Sabbath. And the only link between the human and the monkey was explained in the Holy Quraan is: And indeed you knew those amongst you who transgressed in the matter of the Sabbath (i.e. Saturday). We said to them: "Be you monkeys, despised and rejected."___(Surah Al baqarah-Verse # 65) So When Allah rejected them and curse them to be monkeys, then is it not possible that those unbelievers turned into the monkeys or ape.  And even if in the future the missing link between the human and monkey is found, it has to be of one of the unbeliver. Ok i found another interesting quote: Considering the very close genetic relationship that has been established by comparison of biochemical properties of blood proteins, protein structure and DNA and immunological responses, the differences between a man and a chimpanzee are more astonishing than the resemblances... Something must have happened to the ancestors of Homo sapiens which did not happen to the ancestors of gorillas and chimpanzees Elain Morgan, The Aquatic Ape: A Theory of Human evolution
0 notes
brianobrienny · 4 years
Text
How to Use Data to Make Your Lead Generation More Effective
Data is truly king in today’s content-driven world. Businesses rely on it to support marketing strategies, target their customers, make smarter business decisions, and predict future outcomes.
One of the greatest things about becoming a data-driven business is that it can make incredible improvements from the top to the bottom of the sales funnel – and lead generation is no exception here.
A LinkedIn report on B2B lead generation found that funneling in quality leads is a top priority for 68% of B2B organizations. It’s the top challenge for 59% of marketers as well.
What’s more, the second and third biggest barriers to lead generation are due to a lack of data as well.
Therefore, using data to research and reach your audience, identify prospects, and generate leads is a no-brainer for any B2B organization today. However, this is by no means an easy task. Too often, “data-driven” is little more than a buzzword thrown around or tacked onto SaaS tools as a marketing ploy.
Without a doubt, data needs to play a key role in your company’s lead generation strategy if you’re going to be truly successful in making sales. So how can you be sure that you are not only gathering high-quality data but also using it effectively to its fullest extent?
1. Use CRM Data for Lead Scoring
Most businesses get so caught up in trying to boost the number of leads they are generating that they fail to notice a much more important metric: the relevancy of these leads. Even if you’re funneling in a hundred more leads a day, if very few of them are interested or have the propensity to convert, then it could be a colossal waste of your team’s time and efforts. A data-driven lead generation process has a single goal: to increase the number of leads that move to the next stage.
For example, better top-of-the-funnel data is the key to generating better quality leads at the outset. In order to collect enough behavioral data to create an accurate scoring system, be sure that your marketing and sales departments are using a smart, automated (preferably AI-based) CRM system that keeps track of all past customer interactions and movements. You want to be able to identify patterns that indicate qualified leads so that your team’s efforts to convert them can be more strategic.
A smart CRM can use data from past consumer behavior to predict the current likelihood of conversions. For example, if 45% of leads that download your business’s white paper eventually convert, then a future lead that follows the same behavior pattern should deserve more attention from your sales team.
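As a rough illustration of this idea (not any particular CRM's API; the event names, weights, and threshold below are invented for the example), a behavior-based score can start out as a simple weighted sum over tracked interactions:

```python
# Minimal sketch of behavior-based lead scoring.
# Weights are hypothetical; in practice they would be derived from
# historical conversion rates stored in your CRM.
EVENT_WEIGHTS = {
    "downloaded_whitepaper": 45,   # e.g. 45% of such leads historically convert
    "visited_pricing_page": 30,
    "opened_campaign_email": 10,
    "attended_webinar": 25,
}

QUALIFIED_THRESHOLD = 60  # assumed cut-off for routing a lead to sales


def score_lead(events):
    """Return a score and a qualified/nurture label for one lead's events."""
    score = sum(EVENT_WEIGHTS.get(event, 0) for event in events)
    label = "qualified" if score >= QUALIFIED_THRESHOLD else "nurture"
    return score, label


if __name__ == "__main__":
    lead_events = ["downloaded_whitepaper", "visited_pricing_page"]
    print(score_lead(lead_events))  # (75, 'qualified')
```

In practice, the weights would be learned from historical conversion data rather than hard-coded, which is exactly the kind of pattern a smart CRM surfaces for you.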
This brings us to the importance of the latest technological developments in martech.
2. Make Technology Your Friend
The only way to be absolutely sure that your lead generation strategies are effectively data-driven is to ensure that your data is truly accurate.
The quality of your data is just as crucial. Here, artificial intelligence (AI) based systems can be incredibly useful in processing huge volumes of data, verifying it for accuracy, and categorizing it into workable chunks so that your team can identify specific areas or audience segments to target and nurture.
Further, the more frequently your customer datasets are updated, the better. It is a good idea to conduct a thorough audience analysis upfront so that you’re absolutely sure of the characteristics of leads to look out for. You can do this with internal datasets and research, or using an audience segmentation and selection tool that lets you analyze, categorize, and narrow down your target audience.
AI combined with predictive technology provides businesses with a 360-degree perspective on leads and can automatically zero in on the ones with the highest propensity to buy from you. Essentially, these systems can “learn” from the past and predict audience behavior to engage relevant audiences and induce them to take actions that move them down your sales funnel.
For example, the AI tool Node dives deep into massive databases, analyzes your Total Addressable Market (TAM) using NLP (natural language processing), identifies potential leads, assigns them to the reps best suited to nurture them further, and also predicts renewals and churn.
Another tool, Growbots, uses a similar approach to generating leads but is more focused on social media audiences. It even creates a self-updating database with individual customer profiles and offers targeted, automated, and personalized email campaigns (with follow-ups) for lead nurturing.
3. Personalize the Lead Nurturing Process
Personalization is the epitome of data-driven marketing: an emotion- and empathy-based approach to engaging your audience. It is a highly effective approach to both lead generation and nurturing because of the profound effect it has on consumers. In fact, a study by Epsilon found that 4 in 5 customers are more likely to do business with a company that offers personalized experiences or messaging.
The power of me: The impact of personalization on marketing performance from Epsilon Marketing
In order to attract new leads with personalization, you will first need to have a solid understanding of your core audience and their typical motivations (price, quality, benefits, and so on). But also remember that not every customer is going to follow the same path down the buyer’s journey. For instance, a customer who arrives at your site via an organic search may prefer different content than one who is referred by another client or who clicks on a paid ad.
The best way to make a personalized lead generation strategy possible is by creating buyer journey maps that are founded on past data. Analyze your website’s content and see which pages are driving in converting traffic. This will help you understand which specific marketing strategies help nudge customers from different sources on to the next phase. You can then create automated, trigger-based campaigns that nurture leads by sending them personalized content based on their past interactions and browsing behavior.
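To make the trigger idea concrete (the event names and content choices here are invented for illustration, not tied to any specific marketing platform), a rule might look like this:

```python
# Hypothetical trigger rules: map observed behavior to the next
# piece of personalized content in the nurture sequence.
TRIGGER_RULES = [
    ({"downloaded_whitepaper", "visited_pricing_page"}, "send_case_study_email"),
    ({"clicked_paid_ad"}, "send_product_overview_email"),
    ({"referred_by_client"}, "send_personal_intro_from_rep"),
]


def next_action(observed_events):
    """Pick the first campaign action whose trigger set is satisfied."""
    for required_events, action in TRIGGER_RULES:
        if required_events.issubset(observed_events):
            return action
    return "keep_in_general_newsletter"


print(next_action({"visited_pricing_page", "downloaded_whitepaper"}))
```

A marketing automation platform does essentially this at scale, with the rules informed by the buyer journey maps you built from past data.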
More Data, More Leads; Better Data, Better Leads
A data-driven lead generation approach is really the only way for modern businesses to go if they want to stay competitive. But the key to making it truly effective is knowing not just how to gather this data, but also to apply it.
Be sure that no matter what, your team is using systems that are gathering, analyzing and nurturing leads based on accurate, updated information so that no opportunities are slipping through the cracks. A simple guide to follow would be:
Understanding your audience and knowing what can make them buy from you
Constantly experimenting with different tools and technology that might help you reach a wider audience and target the best prospects
Personalizing your messaging to every customer with content that is relevant and contextual to their situation
What data do you use for targeting and lead generation? Does your CRM provide you with better data that improves the quality of your leads? How do you personalize your content? Let’s discuss strategies in the comments!
If you are ready to get more traffic to your site with quality content that’s published consistently, check out our Content Builder Service. Set up a quick consultation, and I’ll send you a free PDF version of my books. Get started today–and generate more traffic and leads for your business.
The post How to Use Data to Make Your Lead Generation More Effective appeared first on Marketing Insider Group.
0 notes
scienceblogtumbler · 4 years
Text
Applying lessons from Ancient Rome to address human trafficking today
When Jenny Vo-Phamhi was researching a paper for Professor Richard Saller’s seminar about the Roman Empire during her very first quarter at Stanford, she was stunned to discover a problem that has changed little over 2,000 years: human trafficking.
“I was so disturbed by how little human trafficking has changed,” recalled Vo-Phamhi. She found herself eager to know more. Why does the problem continue to persist? Who is most at risk? What do the mechanisms of trafficking look like? And how can studying trafficking in Ancient Rome help us understand the problem today?
Vo-Phamhi, who is graduating from Stanford this weekend with a double major in classics and computer science, pursued a broad range of interests as an undergraduate. She’s interned at the Chan Zuckerberg Biohub, where she helped build an mRNA imaging toolkit using machine learning, and for NASA, where she worked on instrumentation for the Arc Jet Complex. She’s also worked with Justin Leidwanger, an assistant professor of classics, developing computational tools to analyze artifacts used in ancient trade networks.
But the questions she had about human trafficking in Ancient Rome stayed on her mind. They eventually became the focus of her senior honors thesis and her Hume Humanities Honors Fellowship at the Stanford Humanities Center.
“I kept finding myself drawn back to the topic of human trafficking, ancient and modern, from different angles,” said Vo-Phamhi.
Searching for historical clues
While slavery was legal in ancient Rome, people were still trafficked illegally, said Vo-Phamhi.
When she set out to learn more about it, she encountered an issue: there is very little information about it.
“Rome couldn’t have functioned without slaves, yet they’re almost like ghosts in the evidence,” said Vo-Phamhi, “You hardly ever see them substantially mentioned outside law books or as a caricature in Ancient Roman comedies, which don’t always do a great job representing their lived experiences.”
In addition to looking at the few available sources in law and literature, Vo-Phamhi had to piece together their stories herself. She turned to archaeology, census data and even pored through the marble archives of the Epigraphical Museum of Athens in an independent research trip funded by a grant from the Department of Classics.
On that same research trip, Vo-Phamhi visited an archaeological site on the island of Delos, an ancient epicenter of human trafficking where some 10,000 slaves were said to pass through each day. Vo-Phamhi learned how Roman slaves came from all over the world, and with so many slaves passing through daily, it would have been impossible to distinguish who was legal and wasn’t.
As Vo-Phamhi described in her thesis, “In the Roman world, where most types of trafficking were legal and law enforcement was passive, the benefits of conducting trade in spaces highly frequented by customers gave rise to centralized marketplaces where legitimate and illegitimate traffickers could openly gather.”
Combining classics and computer science
While trafficking in Ancient Rome happened so openly, in the U.S. today where trafficking is illegal, its underground operations make it incredibly challenging to study.
Vo-Phamhi took this photo during a research trip to the archaeological site on the Greek island Delos, an epicenter of ancient human trafficking. In the background is the ancient harbor where as many as 10,000 slaves were unloaded and sold in a single day. With so much activity, illegal traffickers could easily blend in. “This photo has been anchoring my thesis,” said Vo-Phamhi. “Also, it basically sums up my classics research experience these past four years: precious, dusty ruins (here, marble) amidst dry grass against a backdrop of blue sea and sky.” (Image credit: Jenny Vo-Phamhi)
To assemble a fuller picture of what trafficking looks like in the U.S. today, Vo-Phamhi had to turn to diverse sources for information – including the dark web – where much of the modern slave trade operates.
This was where Vo-Phamhi was able to tap into her experiences as a computer science major.
For the past five months, Vo-Phamhi has worked with Jared Dunnmon, a postdoctoral researcher in a group led by Christopher Ré, an associate professor in the computer science department. Together, the group built a machine learning tool to analyze a massive dataset of advertisements seized by the U.S. Department of Justice (DOJ) when they shut down a website connected to illegal activities, including money laundering and human trafficking. Vo-Phamhi is mining this data for clues about how the trade network operates, which she hopes will eventually provide information to help catch perpetrators.
Vo-Phamhi also analyzed publicly-available datasets from the U.S. government, including material made available by the DOJ. She also gathered as many news articles about charges and convictions of traffickers as she could to pull together a profile of who was at risk.
Vo-Phamhi learned that, much like human trafficking victims in Ancient Rome, targets today are people from areas of armed conflict, people who are economically disadvantaged, and people who are desperate to improve circumstances for themselves or their families.
“A better understanding of who’s affected and how is essential for helping people get out of these dangerous situations,” Vo-Phamhi said.
Helping identify trafficking victims
Vo-Phamhi wants to do more than just research the problem – she wants to help solve it.
When Vo-Phamhi learned that up to 80% of trafficking victims in the U.S. encounter a healthcare provider while still being trafficked, she saw an opportunity for intervention.
For the past year, Vo-Phamhi has worked with doctors in emergency medicine and primary care at Stanford Hospital and community clinics around the Bay Area and the U.S. to create an awareness campaign to help healthcare professionals realize when a trafficking victim may be in their care. Vo-Phamhi met with doctors, nurses, lawyers, law enforcement, and advocacy groups as she developed signage and pamphlets for healthcare workers as well as victims. Information includes what signs to look for, what questions to ask and what to do next.
“A major takeaway for me from the dark web work has been that it’s really hard to identify trafficked people,” Vo-Phamhi said. “So it’s extremely important to identify them on those rare occasions when they come into direct contact with people who can help.”
Mentorship, realizing limitless opportunities
Vo-Phamhi credits her accomplishments to the support of her thesis advisor and mentor, Richard Saller, the Kleinheinz Family Professor of European Studies in the School of Humanities and Sciences.
“Professor Saller encourages me to go wherever my research leads if there is important work that needs to be done,” Vo-Phamhi said. “He believes I should apply my education and training to society’s largest problems without worrying about boundaries between fields. During my first two years at Stanford, he was dean of the School of Humanities and Sciences, and in the course of our conversations, he became a leadership role model for me.”
After graduation, Vo-Phamhi is dedicating a year for research and community-engaged work.
She also plans to apply to graduate school.
“I will be thinking deeply about how I can best use my education to serve society,” said Vo-Phamhi. “As the pandemic has shown, the world needs thoughtful and capable leaders at all levels. Dr. Anthony Fauci often attributes his wisdom to his classics training. There’s profound value in that long view of humanity over the millennia.”
source https://scienceblog.com/516871/applying-lessons-from-ancient-rome-to-address-human-trafficking-today/
0 notes
caplofan · 4 years
Text
The Unreasonable Importance of Data Preparation in 2020
In a world focused on buzzword-driven models and algorithms, you’d be forgiven for forgetting about the unreasonable importance of data preparation and quality: your models are only as good as the data you feed them.
This is the garbage in, garbage out principle: flawed data going in leads to flawed results, algorithms, and business decisions. If a self-driving car’s decision-making algorithm is trained on data of traffic collected during the day, you wouldn’t put it on the roads at night.
To take it a step further, if such an algorithm is trained in an environment with cars driven by humans, how can you expect it to perform well on roads with other self-driving cars?
Beyond the autonomous driving example described, the “garbage in” side of the equation can take many forms—for example, incorrectly entered data, poorly packaged data, and data collected incorrectly, more of which we’ll address below.
When executives ask me how to approach an AI transformation, I show them Monica Rogati’s AI Hierarchy of Needs, which has AI at the top, and everything is built upon the foundation of data (Rogati is a data science and AI advisor, former VP of data at Jawbone, and former LinkedIn data scientist):
AI Hierarchy of Needs 2020
Image courtesy of Monica Rogati, used with permission.
Why is high-quality and accessible data foundational?
If you’re basing business decisions on dashboards or the results of online experiments, you need to have the right data.
On the machine learning side, we are entering what Andrei Karpathy, director of AI at Tesla, dubs the Software 2.0 era, a new paradigm for software where machine learning and AI require less focus on writing code and more on configuring, selecting inputs, and iterating through data to create higher level models that learn from the data we give them.
In this new world, data has become a first-class citizen, where computation becomes increasingly probabilistic and programs no longer do the same thing each time they run.
The model and the data specification become more important than the code.
Collecting the right data requires a principled approach that is a function of your business question.
Data collected for one purpose can have limited use for other questions.
The assumed value of data is a myth leading to inflated valuations of start-ups capturing said data. John Myles White, data scientist and engineering manager at Facebook, wrote:
“The biggest risk I see with data science projects is that analyzing data per se is generally a bad thing.
Generating data with a pre-specified analysis plan and running that analysis is good. Re-analyzing existing data is often very bad.”
John is drawing attention to thinking carefully about what you hope to get out of the data, what question you hope to answer, what biases may exist, and what you need to correct before jumping in with an analysis[1].
With the right mindset, you can get a lot out of analyzing existing data—for example, descriptive data is often quite useful for early-stage companies[2].
Not too long ago, “save everything” was a common maxim in tech; you never knew if you might need the data. However, attempting to repurpose pre-existing data can muddy the water by shifting the semantics from why the data was collected to the question you hope to answer. In particular, determining causation from correlation can be difficult.
For example, a pre-existing correlation pulled from an organization’s database should be tested in a new experiment and not assumed to imply causation[3], instead of this commonly encountered pattern in tech:
A large fraction of users that do X do Z
Z is good
Let’s get everybody to do X
Correlation in existing data is evidence for causation that then needs to be verified by collecting more data.
The same challenge plagues scientific research. Take the case of Brian Wansink, former head of the Food and Brand Lab at Cornell University, who stepped down after a Cornell faculty review reported he “committed academic misconduct in his research and scholarship, including misreporting of research data, problematic statistical techniques [and] failure to properly document and preserve research results.” One of his more egregious errors was to continually test already collected data for new hypotheses until one stuck, after his initial hypothesis failed[4]. NPR put it well: “the gold standard of scientific studies is to make a single hypothesis, gather data to test it, and analyze the results to see if it holds up. By Wansink’s own admission in the blog post, that’s not what happened in his lab.” He continually tried to fit new hypotheses unrelated to why he collected the data until he got a hypothesis with an acceptable p-value, a perversion of the scientific method.
Data professionals spend an inordinate amount on time cleaning, repairing, and preparing data
Before you even think about sophisticated modeling, state-of-the-art machine learning, and AI, you need to make sure your data is ready for analysis—this is the realm of data preparation. You may picture data scientists building machine learning models all day, but the common trope that they spend 80% of their time on data preparation is closer to the truth.
This is old news in many ways, but it’s old news that still plagues us: a recent O’Reilly survey found that lack of data or data quality issues was one of the main bottlenecks for further AI adoption for companies at the AI evaluation stage and was the main bottleneck for companies with mature AI practices.
Good quality datasets are all alike, but every low-quality dataset is low-quality in its own way[5]. Data can be low-quality if:
It doesn’t fit your question or its collection wasn’t carefully considered;
It’s erroneous (it may say “cicago” for a location), inconsistent (it may say “cicago” in one place and “Chicago” in another), or missing;
It’s good data but packaged in an atrocious way—e.g., it’s stored across a range of siloed databases in an organization;
It requires human labeling to be useful (such as manually labeling emails as “spam” or “not” for a spam detection algorithm).
This definition of low-quality data defines quality as a function of how much work is required to get the data into an analysis-ready form. Look at the responses to my tweet for data quality nightmares that modern data professionals grapple with.
The importance of automating data preparation
Most of the conversation around AI automation involves automating machine learning models, a field known as AutoML.
This is important: consider how many modern models need to operate at scale and in real time (such as Google’s search engine and the relevant tweets that Twitter surfaces in your feed). We also need to be talking about automation of all steps in the data science workflow/pipeline, including those at the start. Why is it important to automate data preparation?
It occupies an inordinate amount of time for data professionals. Automating data drudgery in the era of data smog will free data scientists up for more interesting, creative work (such as modeling or interfacing with business questions and insights). “76% of data scientists view data preparation as the least enjoyable part of their work,” according to a CrowdFlower survey.
A series of subjective data preparation micro-decisions can bias your analysis. For example, one analyst may throw out data with missing values, another may infer the missing values. For more on how micro-decisions in analysis can impact results, I recommend Many Analysts, One Data Set: Making Transparent How Variations in Analytic Choices Affect Results[6] (note that the analytical micro-decisions in this study are not only data preparation decisions).
Automating data preparation won’t necessarily remove such bias, but it will make it systematic, discoverable, auditable, unit-testable, and correctable. Model results will then be less reliant on individuals making hundreds of micro-decisions.
An added benefit is that the work will be reproducible and robust, in the sense that somebody else (say, in another department) can reproduce the analysis and get the same results[7];
For the increasing number of real-time algorithms in production, humans need to be taken out of the loop at runtime as much as possible (and perhaps be kept in the loop more as algorithmic managers): when you use Siri to make a reservation on OpenTable by asking for a table for four at a nearby Italian restaurant tonight, there’s a speech-to-text model, a geographic search model, and a restaurant-matching model, all working together in real time.
No data analysts or scientists work on this data pipeline, as everything must happen in real time, requiring an automated data preparation and data quality workflow (e.g., to resolve “eye-talian” as “Italian”).
The third point above speaks more generally to the need for automation around all parts of the data science workflow. This need will grow as smart devices, IoT, voice assistants, drones, and augmented and virtual reality become more prevalent.
Automation represents a specific case of democratization, making data skills easily accessible for the broader population. Democratization involves both education (which I focus on in my work at DataCamp) and developing tools that many people can use.
Given the importance of general automation and democratization across all parts of the DS/ML/AI workflow, it’s worth recognizing that we’ve done pretty well at democratizing data collection and gathering, modeling[8], and data reporting[9]; what remains stubbornly difficult is the whole process of preparing the data.
Modern tools for automating data cleaning and data preparation
We’re seeing the emergence of modern tools for automated data cleaning and preparation, such as HoloClean and Snorkel coming from Christopher Ré’s group at Stanford.
HoloClean decouples the task of data cleaning into error detection (such as recognizing that the location “cicago” is erroneous) and repairing erroneous data (such as changing “cicago” to “Chicago”), and formalizes the fact that “data cleaning is a statistical learning and inference problem.”
All data analysis and data science work is a combination of data, assumptions, and prior knowledge. So when you’re missing data or have “low-quality data,” you use assumptions, statistics, and inference to repair your data.
HoloClean performs this automatically in a principled, statistical manner. All the user needs to do is “to specify high-level assertions that capture their domain expertise with respect to invariants that the input data needs to satisfy. No other supervision is required!”
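HoloClean's actual interface is more involved than this; purely to illustrate the detect-then-repair split described above, here is a minimal pandas sketch that flags location values missing from a reference list and repairs them by fuzzy matching (the data and the matching rule are made up for the example):

```python
import pandas as pd
from difflib import get_close_matches

# Toy data with an erroneous and an inconsistent location value.
df = pd.DataFrame({"city": ["Chicago", "cicago", "New York", "chicago"]})

KNOWN_CITIES = ["Chicago", "New York", "San Francisco"]  # assumed reference list


def detect_and_repair(value):
    """Error detection: the value is not in the reference list.
    Repair: replace it with the closest known value, if one is close enough."""
    if value in KNOWN_CITIES:
        return value
    matches = get_close_matches(value.title(), KNOWN_CITIES, n=1, cutoff=0.8)
    return matches[0] if matches else value  # leave it alone if no good match


df["city_clean"] = df["city"].apply(detect_and_repair)
print(df)
```

A system like HoloClean replaces the hand-written rule with statistical learning and inference over the user's high-level assertions, but the detect/repair decomposition is the same.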
The HoloClean team also has a system for automating the “building and managing [of] training datasets without manual labeling” called Snorkel. Having correctly labeled data is a key part of preparing data to build machine learning models[10].
As more and more data is generated, manually labeling it is unfeasible.
Snorkel provides a way to automate labeling, using a modern paradigm called data programming, in which users are able to “inject domain information [or heuristics] into machine learning models in higher level, higher bandwidth ways than manually labeling thousands or millions of individual data points.”
Researchers at Google AI have adapted Snorkel to label data at industrial/web scale and demonstrated its utility in three scenarios: topic classification, product classification, and real-time event classification.
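Snorkel's real API wraps this in labeling-function decorators and a learned label model; the bare-bones sketch below only illustrates the underlying data programming idea (several noisy heuristics voting on a label), with rules invented for the spam example:

```python
# Each "labeling function" encodes a heuristic: 1 = spam, 0 = not spam, -1 = abstain.
def lf_contains_free(text):
    return 1 if "free" in text.lower() else -1

def lf_many_exclamations(text):
    return 1 if text.count("!") >= 3 else -1

def lf_known_thread(text):
    return 0 if text.startswith("Re:") else -1

LABELING_FUNCTIONS = [lf_contains_free, lf_many_exclamations, lf_known_thread]


def weak_label(text):
    """Majority vote over non-abstaining heuristics (Snorkel instead learns
    a weighted generative model over the heuristics' outputs)."""
    votes = [lf(text) for lf in LABELING_FUNCTIONS if lf(text) != -1]
    if not votes:
        return None  # no heuristic fired; leave unlabeled
    return int(sum(votes) > len(votes) / 2)


print(weak_label("FREE prize!!! Claim now"))  # 1 (spam)
print(weak_label("Re: tomorrow's meeting"))   # 0 (not spam)
```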
Snorkel doesn’t stop at data labeling. It also allows you to automate two other key aspects of data preparation:
Data augmentation—that is, creating more labeled data. Consider an image recognition problem in which you are trying to detect cars in photos for your self-driving car algorithm.
Classically, you’ll need at least several thousand labeled photos for your training dataset. If you don’t have enough training data and it’s too expensive to manually collect and label more data, you can create more by rotating and reflecting your images.
Discovery of critical data subsets—for example, figuring out which subsets of your data really help to distinguish spam from non-spam.
These are two of many current examples of the augmented data preparation revolution, which includes products from IBM and DataRobot.
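As a small, self-contained illustration of the augmentation idea described above (NumPy only; the random array stands in for a decoded photo), flips and rotations turn one labeled image into six training examples:

```python
import numpy as np

def augment(image_array):
    """Return simple variants of one labeled image (H x W x C array):
    horizontal/vertical flips and 90-degree rotations."""
    variants = [image_array,
                np.fliplr(image_array),   # mirror left-right
                np.flipud(image_array)]   # mirror top-bottom
    variants += [np.rot90(image_array, k) for k in (1, 2, 3)]
    return variants

# Hypothetical labeled photo loaded as an array, e.g. via Pillow or imageio.
photo = np.random.randint(0, 255, size=(224, 224, 3), dtype=np.uint8)
augmented_batch = augment(photo)          # 6 training examples from 1 original
print(len(augmented_batch), [a.shape for a in augmented_batch[:3]])
```

Each variant keeps the original label, which is what makes augmentation a cheap way to grow a training set.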
The future of data tooling and data preparation as a cultural challenge
So what does the future hold? In a world with an increasing number of models and algorithms in production, learning from large amounts of real-time streaming data, we need both education and tooling/products for domain experts to build, interact with, and audit the relevant data pipelines.
We’ve seen a lot of headway made in democratizing and automating data collection and building models. Just look at the emergence of drag-and-drop tools for machine learning workflows coming out of Google and Microsoft.
As we saw from the recent O’Reilly survey, data preparation and cleaning still take up a lot of time that data professionals don’t enjoy. For this reason, it’s exciting that we’re now starting to see headway in automated tooling for data cleaning and preparation. It will be interesting to see how this space grows and how the tools are adopted.
A bright future would see data preparation and data quality as first-class citizens in the data workflow, alongside machine learning, deep learning, and AI. Dealing with incorrect or missing data is unglamorous but necessary work.
It’s easy to justify working with data that’s obviously wrong; the only real surprise is the amount of time it takes. Understanding how to manage more subtle problems with data, such as data that reflects and perpetuates historical biases (for example, real estate redlining) is a more difficult organizational challenge.
This will require honest, open conversations in any organization around what data workflows actually look like.
The fact that business leaders are focused on predictive models and deep learning while data workers spend most of their time on data preparation is a cultural challenge, not a technical one. If this part of the data flow pipeline is going to be solved in the future, everybody needs to acknowledge and understand the challenge.
Original Source: The unreasonable importance of data preparation
0 notes
365datascienceblog · 4 years
Text
Data Analyst Interview Questions And Answers
Data Analyst Interview Questions And Answers
Why you should be familiar with data analyst interview questions?
If you’re aiming for a data analyst job, sooner or later you’ll reach the final stage of the application process – the data analyst job interview. So, how can you ace the interview with ease? By becoming familiar with common data analyst interview questions in advance.
And that’s exactly why you should read this article. Here you’ll learn everything you need to nail the challenging job interview and secure a career as a data analyst:
How to prepare for the data analyst interview (the top data analyst skills you need to acquire);
A list of real-life data analyst interview questions and answers;
What the data analyst interview process in 3 leading companies looks like.
Bonus content: how to present yourself in your best light and leave a lasting impression on the interviewers. But that will come last. In the meantime…
How to be ready for a data analyst interview?
No matter where you apply for a data analyst job, no recruiter will call you in for an interview if you don’t possess the necessary skills. And when it comes to data analysis, you can’t go without the following:
Programming and coding language skills using Python, R, etc.;
Expertise in SQL and a good understanding of how relational database management systems work;
Tableau;
Experience with large data sets and distributed computing;
Excellent Excel skills and ability to use advanced analytics and formulas;
Knowledge of statistics and statistical software packages, quantitative methods, confidence intervals, sampling and test/control cells.
And while we’re at it, if you want to pursue a career in data analysis but you lack the technical education and skills, we also offer a free preview version of the Data Science Program. You’ll receive 12 hours of beginner to advanced content for free. It’s a great way to check if the program is right for you.
What data analyst interview questions you should know the answers to?
General Data Analyst Interview Questions
General data analyst interview questions are not just about your background and work experience. In fact, interviewers might surprise you with questions requiring details about the projects you’ve been involved in and how you approach complex data sets. So, let’s take a look:
1. Can you share details about the largest data set you’ve worked with? How many entries and variables did the data set comprise? What kind of data was included?
How to Answer
Working with large datasets and dealing with a substantial number of variables and columns is important for a lot of hiring managers. When answering the question, you don’t have to reveal background information about the project or how you managed each stage. Focus on the size and type of data.
Example Answer
“I believe the largest data set I’ve worked with was within a joint software development project. The data set comprised more than a million records and 600-700 variables. My team and I had to work with Marketing data which we later loaded into an analytical tool to perform EDA.”
2. In your role as a data analyst, have you ever recommend a switch to different processes or tools? What was the result of your recommendation?
How to Answer
For hiring managers, it’s important that they pick a data analyst who is not only knowledgeable but also confident enough to initiate a change that would improve the company’s status quo. When talking about the recommendation you made, give as many details as possible, including your reasoning behind it. Even if the recommendation you made was not implemented, it still demonstrates that you’re driven and you strive for improvement.
Example Answer
“Although data from non-technical departments is usually handled by data analysts, I’ve worked for a company where colleagues who were not on the data analysis side had access to data. This led to many cases of misinterpreted data that caused significant damage to the overall company strategy. I gathered examples and pointed out that working with data dictionaries can actually do more harm than good. I recommended that my coworkers depend on data analysts for data access. Once we implemented my recommendation, the cases of misinterpreted data dropped drastically.”
3. How would you assess your writing skills? When do you use written form of communication in your role as a data analyst?
How to Answer
Working with numbers is not the only aspect of a data analyst job. Data analysts also need strong writing skills, so they can present the results of their analysis to management and stakeholders efficiently. If you think you are not the greatest data “storyteller”, make sure you’re making efforts in that direction, e.g. through additional training.
Example Answer
“Over time, I’ve had plenty of opportunities to enhance my writing skills, be it through email communication with coworkers, or through writing analytical project summaries for the upper management. I believe I can interpret data in a clear and succinct manner. However, I’m constantly looking for ways to improve my writing skills even further.”
4. Have you ever used both quantitative and qualitative data within the same project?
How to Answer
To conduct a meaningful analysis, data analysts must use both the quantitative and qualitative data available to them. In surveys, there are both quantitative and qualitative questions, so merging those 2 types of data presents no challenge whatsoever. In other cases, though, a data analyst must use creativity to find matching qualitative data. That said, when answering this question, talk about the project where the most creative thinking was required.
Example Answer
“In my experience, I’ve performed a few analyses where I had qualitative survey data at my disposal. However, I realized I can actually enhance the validity of my recommendations by also implementing valuable data from external survey sources. So, for a product development project, I used qualitative data provided by our distributors, and it yielded great results.”
5. What is your experience in conducting presentations to various audiences?
How to Answer
Strong presentation skills are extremely valuable for any data analyst. Employers are looking for candidates who not only possess brilliant analytical skills, but also have the confidence and eloquence to present their results to different audiences, including upper-level management and executives, and non-technical coworkers. So, when talking about the audiences you’ve presented to, make sure you mention the following:
Size of the audience;
Whether it included executives;
Departments and background of the audience;
Whether the presentation was in person or remote, as the latter can be very challenging.
Example Answer
“In my role as a Data Analyst, I have presented to various audiences made up of coworkers and clients with differing backgrounds. I’ve given presentation to both small and larger groups. I believe the largest so far has been around 30 people, mostly colleagues from non-technical departments. All of these presentations were conducted in person, except for 1 which was remote via video conference call with senior management.”
6. Have you worked in an industry similar to ours?
How to Answer
This is a pretty straightforward question, aiming to assess if you have industry-specific skills and experience. Even if you don’t, make sure you’ve prepared an answer in advance where you explain how you can apply your background skills from a different field to the benefit of the company.
Example Answer
“As a data analyst with financial background, I can say there are a few similarities between this industry and healthcare. I think the most prominent one is data security. Both industries utilize highly sensitive personal data that must be kept secure and confidential. This leads to 2 things: more restricted access to data, and, consequently, more time to complete its analysis. This has taught me to be more time efficient when it comes to passing through all the security. Moreover, I learned how important it is to clearly state the reasons behind requiring certain data for my analysis.”
7. Have you earned any certifications to boost your career opportunities as a Data Analyst?
How to Answer
Hiring managers appreciate a candidate who is serious about advancing their career options through additional qualifications. Certificates prove that you have put in the effort to master new skills and knowledge of the latest analytical tools and subjects. While answering the question, list the certificates you have acquired and briefly explain how they’ve helped you boost your data analyst career. If you haven’t earned any certifications so far, make sure you mention the ones you’d like to work towards and why.
Example Answer
“I’m always looking for ways to upgrade my analytics skillset. This is why I recently earned a certification in Customer Analytics in Python. The training and requirements to finish it really helped me sharpen my skills in analyzing customer data and predicting the purchase behavior of clients.”
Technical Data Analyst Interview Questions
Technical data analyst interview questions are focused on assessing your proficiency in analytical software, visualization tools, and scripting languages, such as SQL and Python. Depending on the specifics of the job, you might be requested to answer some more advanced statistical questions, too. Here are some real-world examples:
8. What tools or software do you prefer using in the various phases of data analysis and why?
How to Answer
Although you might think you should have experience with as many tools as possible to ace this question, this is not the case. Each company uses specific data analysis tools, so it’s normal that your expertise is limited to those. Of course, if you have worked for a large number of companies, you’re bound to have exposure to a wider variety of analytical software. That said, the interviewer would like to know which tools you feel comfortable with, rather than the number of tools you’ve utilized.
Example Answer
“When it comes to data analysis tools, I can say I’m a traditionalist. That’s why, I find Microsoft Excel and Microsoft Access most useful. I feel truly comfortable working with those, and they’re available in almost every company out there. Moreover, you can achieve great results with them with the right training.”
9. Have you ever created or worked with statistical models? If so, please describe how you’ve used it to solve a business task.
How to Answer
As a data analyst, you don’t specifically need experience with statistical models, unless it’s required for the job you’re applying for. If you haven’t been involved in building, using, or maintaining statistical models, be open about it and mention any knowledge or partial experience you may have.
Example Answer
“Being a data analyst, I can’t say I’ve had direct experience building statistical models. However, I’ve helped the statistical department by making sure they have access to the proper data and analyzing it. The model in question was built with the purpose of identifying the customers who were most inclined to buy additional products and predicting when they were most likely to make that decision. My job was to establish the appropriate variables used in the model and assess its performance once it was ready.”
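A stripped-down sketch of the kind of propensity model this answer describes might look like the following (logistic regression in scikit-learn on made-up behavioral features; a real project would use far richer data, a train/test split, and proper validation):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per customer: [whitepaper_downloads, support_tickets, months_active]
X = np.array([[2, 0, 12],
              [0, 3, 2],
              [1, 1, 24],
              [3, 0, 6],
              [0, 2, 1],
              [2, 1, 18]])
y = np.array([1, 0, 1, 1, 0, 1])  # 1 = bought the additional product

model = LogisticRegression().fit(X, y)

new_customer = np.array([[1, 0, 10]])
print(model.predict_proba(new_customer)[0, 1])  # estimated purchase propensity
```

The data analyst's contribution in the answer above maps onto choosing the feature columns (the "appropriate variables") and checking how well the fitted model performs.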
10. Which step of a data analysis project do you enjoy the most?
How to Answer
It’s normal for a data analyst to have preferences of certain tasks over others. However, you’ll most probably be expected to deal with all steps of a project – from querying and cleaning, through analyzing, to communicating findings. So, make sure you don’t show antipathy to any of the above. Instead, use this question to highlight your strengths. Just focus on the task you like performing the most and explain why it’s your favorite.
Example Answer
“If I had to select one step as a favorite, it would be analyzing the data. I enjoy developing a variety of hypotheses and searching for evidence to support or refute them. Sometimes, while following my analytical plan, I have stumbled upon interesting and unexpected learnings from the data. I believe there is always something to be learned from the data, whether big or small, that will help me in future analytical projects.”
11. What’s your knowledge of statistics and how have you used it in your work as a data analyst?
How to Answer
Data analysts should have basic statistics knowledge and experience. That means you should be comfortable with calculating mean, median and mode, as well as conducting significance testing. In addition, as a data analyst, you must be able to interpret the above in connection to the business. If a higher level of statistics is required, it will be listed in the job description.
Example Answer
“In my line of work, I’ve used basic statistics – mostly calculating means and standard deviations, as well as conducting significance testing. The latter helped me determine the statistical significance of measurement differences between two populations for a project. I’ve also determined the relationship between 2 variables in a data set, working with correlation coefficients.”
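For reference, the calculations mentioned in this answer map to a few lines of Python (the numbers below are invented):

```python
import numpy as np
from scipy import stats

group_a = np.array([4.1, 3.8, 4.4, 4.0, 4.6])  # e.g. satisfaction scores, sample A
group_b = np.array([3.2, 3.5, 3.1, 3.9, 3.4])  # sample B

print("means:", group_a.mean(), group_b.mean())
print("standard deviations:", group_a.std(ddof=1), group_b.std(ddof=1))

# Relationship between two variables: Pearson correlation coefficient.
r, r_pvalue = stats.pearsonr(group_a, group_b)
print("correlation:", r)

# Significance test for the difference in means between the two samples.
t_stat, p_value = stats.ttest_ind(group_a, group_b)
print("t-test p-value:", p_value)
```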
12. What scripting languages have you used in your projects as a data analyst? Which one you you’d say you like best?
How to Answer
Most large companies work with numerous scripting languages. So, a good command of more than one is definitely a plus. Nevertheless, if you aren’t well familiar with the main language used by the company you apply at, you can still make a good impression. Demonstrate enthusiasm to expand your knowledge, and point out that your fluency in other scripting languages gives you a solid foundation for learning new ones.
Example Answer
“I’m most confident in using SQL, since that’s the language I’ve worked with throughout my Data Analyst experience. I also have a basic understanding of Python and have recently enrolled in a Python Programming course to sharpen my skills. So far, I’ve discovered that my expertise in SQL helps me advance in Python with ease.”
By the way, if you’re finding this answer useful, consider sharing this article, so others can benefit from it, too. Helping fellow aspiring data analysts reach their goals is one of the things that make the data science community special.
13. How many years of SQL programming experience do you have? In your latest job, how many of your analytical projects involved using SQL?
How to Answer
SQL is considered one of the easiest scripting languages to learn. So, if you want to be competitive on the job market as a Data Analyst, you should be able to demonstrate excellent command of SQL. Even if you don’t have many years of experience, highlight how your skills have improved with each new project.
Example Answer
“I’ve used SQL in at least 80% of my projects over a period of 5 years. Of course, I’ve also turned to other programming languages for the different phases of my projects. But, all in all, it’s SQL that I’ve utilized the most and consider the best for most of my data analyst tasks.”
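As an illustration of the kind of day-to-day query this answer refers to (run through Python’s built-in sqlite3 so the snippet is self-contained; the orders table and its columns are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer_id INTEGER, region TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 'EU', 120.0), (2, 'EU', 80.0),
                              (3, 'US', 200.0), (1, 'EU', 60.0);
""")

# Typical analyst query: revenue and order count per region, largest first.
query = """
    SELECT region,
           COUNT(*)     AS n_orders,
           SUM(amount)  AS revenue
    FROM orders
    GROUP BY region
    ORDER BY revenue DESC;
"""
for row in conn.execute(query):
    print(row)
```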
14. Which Excel functions have you used on a regular basis so far? Can you describe in detail how you’ve used Excel as an analytical tool in your projects?
How to Answer
If you’re an Excel pro, there is no need to recite each and every function you’ve used. Instead, highlight your advanced Excel skills, such as working with statistical functions, pivot tables, and graphs. Hiring managers will presume that a candidate comfortable with the more challenging functions also knows the basics, so lead with those. Of course, if you lack the experience, it’s worth considering a specialized Excel training that will help you build a competitive skillset.
Example Answer
“I think I’ve used Excel every day of my data analyst career in every single phase of my analytical projects. For example, I’ve checked, cleaned, and analyzed data sets using Pivot tables. I’ve also turned to statistical functions to calculate standard deviations, correlation coefficients, and others. Not to mention that the Excel graphing function is great for developing visual summaries of the data. As a case in point, I’ve worked with raw data from external vendors in many customer satisfaction surveys. First, I’d use sort functions and pivot tables to ensure the data was clean and loaded properly. In the analysis phase, I’d segment the data with pivot tables and the statistical functions, if necessary. Finally, I’d build tables and graphs for efficient visual representation.”
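The Excel workflow described in that answer (clean, pivot, summarize, chart) has a close programmatic cousin in pandas. The sketch below is not a claim about how the original surveys were analyzed; it just shows the same pivot-table and statistical steps on a small, invented dataset.

```python
# Rough pandas equivalents of an Excel pivot table and statistical functions.
# The survey data below is invented for illustration.
import pandas as pd

survey = pd.DataFrame({
    "segment": ["A", "A", "B", "B", "B", "C"],
    "channel": ["web", "store", "web", "web", "store", "store"],
    "score":   [8, 7, 9, 6, 7, 10],
    "spend":   [120, 90, 150, 80, 95, 200],
})

# Pivot table: rows = segment, columns = channel, values = mean score
pivot = survey.pivot_table(index="segment", columns="channel",
                           values="score", aggfunc="mean")
print(pivot)

# Statistical functions, similar in spirit to STDEV.S and CORREL in Excel
print("std dev of scores:", survey["score"].std())
print("score/spend correlation:", survey["score"].corr(survey["spend"]))
```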
15. What’s your experience in creating dashboards? Can you share what tools you’ve used for the purpose?
How to Answer
Dashboards are essential for managers, as they visually capture KPIs and metrics and help them track business goals. That said, data analysts are often involved in both building and updating dashboards. Some of the best tools for the purpose are Excel, Tableau, and Power BI (so make sure you’ve got a good command of those). When you talk about your experience, outline the types of data visualizations, and metrics you used in your dashboard.
Example Answer
“In my line of work, I’ve created dashboards related to customer analytics in both Power BI and Excel. That means I used marketing metrics, such as brand awareness, sales, and customer satisfaction. To visualize the data, I worked with pie charts, bar graphs, line graphs, and tables.”
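If you want to prototype a dashboard panel before rebuilding it in Power BI or Tableau, a few lines of plotting code are enough. The metrics and values below are invented purely to show the shape of such a panel.

```python
# A minimal two-panel sketch: one bar chart and one line chart of made-up KPIs.
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr"]
brand_awareness = [42, 45, 47, 51]   # % of surveyed customers (invented)
sales = [130, 128, 142, 155]         # units in thousands (invented)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
ax1.bar(months, brand_awareness)
ax1.set_title("Brand awareness (%)")
ax2.plot(months, sales, marker="o")
ax2.set_title("Sales (thousand units)")
fig.tight_layout()
fig.savefig("dashboard_panel.png")   # in a BI tool this would be a live, refreshable visual
```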
Behavioral Data Analyst Interview Questions
To answer this type of data analyst interview questions with ease, you’ll need to take a walk down memory lane and recall details about how you handled specific challenges in your work with stakeholders, coworkers, or clients. Here’s what we have in mind:
16. As a data analyst, you’ll often work with stakeholders who lack technical background and a deeper understanding of data and databases. Have you ever been in a situation like this and how did you handle this challenge?
How to Answer
Data analysts often face the challenge of communicating findings to coworkers from different departments or senior management with a limited understanding of data. This requires excellent skills in interpreting specific terms using non-technical language. Moreover, it also requires extra patience to listen to your coworkers’ questions and provide answers in an easy-to-digest way. Show the interviewer that you’re capable of working efficiently with people from different backgrounds who don’t speak your “language”.
Example Answer
“In my work with stakeholders, it often comes down to the same challenge – facing a question I don’t have the answer to, due to limitations of the gathered data or the structure of the database. In such cases, I analyze the available data to deliver answers to the most closely related questions. Then, I give the stakeholders a basic explanation of the current data limitations and propose the development of a project that would allow us to gather the unavailable data in the future. This shows them that I care about their needs and I’m willing to go the extra mile to provide them with what they need.”
17. Tell me about a time you and your team were surprised by the results of a project.
How to Answer
When starting an analysis, most data analysts have a rough prediction of the outcome, based on findings from previous projects. But there’s always room for surprise, and sometimes the results are completely unexpected. This question gives you a chance to talk about the types of analytical projects you’ve been involved in. Plus, it allows you to demonstrate your excitement about drawing new learnings from your projects. And don’t forget to mention the action you and the stakeholders took as a result of the unexpected outcome.
Example Answer
“While performing routine analysis of a customer database, I was completely surprised to discover a customer subsegment that the company could target with a new suitable product and a relevant message. That presented a great opportunity for additional revenue for the company by utilizing a subset of an existing customer base. Everyone on my team was pleasantly surprised and soon enough we began devising strategies with Product Development to address the needs of this newly discovered subsegment.”
18. Why do you think creativity is important for a data analyst? How have you used creative thinking in your work so far?
How to Answer
A data analyst is usually seen as a professional with a technical background and excellent math and statistical skills. However, even though creativity is not the first data analyst quality that comes to your mind, it’s still important in developing analytical plans and data visualizations, and even finding unorthodox solutions to data issues. That said, provide an answer with examples of your out-of-the-box way of thinking.
Example Answer
“I can say creativity can make all the difference in a data analyst’s work. In my personal experience, it has helped me find intriguing ways to present analysis results to clients. Moreover, it has helped me devise new data checks that identify issues resulting in anomalous results during data analysis.”
19. What are the most important skills a data analyst should possess to work efficiently with team members with various backgrounds, roles, and duties?
How to Answer
When answering this question, keep in mind that the hiring manager would like to hear something different than “communication skills”. Think of an approach you’ve used in your role as a data analyst to improve the quality of work in a cross-functional team.
Example Answer
“I think the role of a data analyst goes beyond explaining technical terms in a non-technical language. I always strive to gain a deeper understanding of the work of my colleagues, so I can bridge my explanation of statistical concepts to the specific parts of the business they deal with, and show how these concepts relate to the tasks they need to solve.”
20. In your opinion, which soft skills are essential for a data analyst and why?
How to Answer
Soft skills, a.k.a. non-technical skills, are important for working efficiently with others and maintaining a high level of performance. As with most professions, data analysts should be aware of how their behavior and work habits affect the other members of their team. Therefore, you should base your answer on past work experience and highlight an important soft skill you have developed.
Example Answer
“I believe leadership skills are one of the major soft skills a data analyst should develop. The way I understand it, leadership means taking action to guide and help the members on your team. And this doesn’t necessarily mean you have to be in a managerial position. In my line of work, leadership would translate into providing expert insights regarding company data and its interpretation. That’s a skill I’ve worked hard to develop over the years. I can say being confident in my abilities has now established me as a leading figure in my area, and my team members know they can rely on my expertise.”
Brainteasers
Interviews for analytical and technical positions often include brainteasers that aim to evaluate how you apply logic, critical thinking, and creativity under pressure. Here’s an example:
21. A car travels a distance of 60 miles at an average speed of 30 miles per hour. How fast does the car need to travel on the way back (taking the same road) in order to average 40 miles per hour over the course of the entire trip?
You need to build the following equation:
The total distance that needs to be traveled both ways is 120 miles. The average speed that we need to obtain is 40 miles per hour; therefore, the whole trip must take 3 hours:
120 miles/40 miles per hour = 3 hours
The car has already traveled for two hours:
60 miles/30 miles per hour = 2 hours
So, on the way back, it has only 1 hour left. The distance is 60 miles. Hence, the car needs to travel at 60 miles per hour.
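If you want to double-check the arithmetic, or narrate it step by step in the interview, the same reasoning fits in a few lines of Python using only the numbers from the puzzle:

```python
# Numeric check of the average-speed reasoning above.
total_distance = 60 + 60                         # miles, out and back
target_avg_speed = 40                            # mph for the whole trip
total_time = total_distance / target_avg_speed   # 3 hours
time_out = 60 / 30                               # 2 hours already spent on the way out
time_back = total_time - time_out                # 1 hour remaining
print(60 / time_back)                            # required return speed -> 60.0 mph
```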
Guesstimate
Guesstimates can be critical in picking the right candidate for a data analyst job, as they assess your problem-solving abilities, confidence with numbers, and how you handle different scenarios.
22. What is the monthly profit of your favorite restaurant?
Pick a small family restaurant and not a chain of restaurants. This should make calculations much easier.
Then define the main parameters of the restaurant that we are talking about:
Days of the week in which the restaurant is open
Number of tables/seats
Average number of visitors:
– during lunchtime;
– at dinner;
Average expenditure:
– per client during lunch;
– per client during dinner.
The restaurant is open 6 days of the week (they are closed on Monday), which means it serves lunch and dinner about 25 days per month. It is a small family restaurant with around 60 seats. On average, 30 customers visit the restaurant at lunch and 40 people come to have dinner. The typical lunch menu costs 10 euro, while dinner at this restaurant costs twice that amount – 20 euro. Therefore, they are able to achieve revenues of:
25 (days) * 30 (customers) * 10 (EUR) = 7,500 EUR (lunch)
25 (days) * 40 (customers) * 20 (EUR) = 20,000 EUR (dinner)
The restaurant is able to achieve 27,500 EUR of sales. Besides the owner and his wife, 4 people work there as well. Let’s say that the 3 waiters make 2,000 EUR each and the chef makes 3,000 EUR (including social security contributions). So the cost of personnel is 9,000 EUR. Usually, food and drinks cost around one-third of the overall amount of sales, so the cost of goods sold amounts to roughly 9,125 EUR. Utilities and other expenses are another 10% of sales, which adds a cost of 2,750 EUR. The owners do not pay rent, because they own the place. After these calculations, we arrive at a monthly profit (before taxes) of 6,625 EUR.
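Laying the assumptions out explicitly makes the estimate easier for an interviewer to follow. The sketch below simply restates the figures used above in Python; note that the cost-of-goods number is the rounded "around one-third of sales" figure from the text.

```python
# Reproducing the restaurant estimate with the assumptions stated above.
open_days = 25                             # open 6 days a week, roughly 25 days a month

lunch_revenue  = open_days * 30 * 10       # 7,500 EUR
dinner_revenue = open_days * 40 * 20       # 20,000 EUR
sales = lunch_revenue + dinner_revenue     # 27,500 EUR

personnel = 3 * 2_000 + 3_000              # 9,000 EUR (3 waiters + 1 chef)
cogs = 9_125                               # ~ one-third of sales, rounded as in the text
utilities = 0.10 * sales                   # 2,750 EUR
rent = 0                                   # the owners own the place

profit_before_tax = sales - personnel - cogs - utilities - rent
print(profit_before_tax)                   # -> 6625.0 EUR per month
```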
What can you expect from the data analyst interview process?
Phone screens, onsite interviews, the number of teams who ask the data analyst interview questions… All of these vary depending on the company you’re applying at. Here are 3 real interview processes for data analyst positions, so you can get a sense of how a data analyst interview goes down.
Netflix
Two detailed data analyst phone interviews – one with a recruiter, followed by another one with the hiring manager. There are also two onsite interviews. The first one is with about 4 people from the data analyst team. So, you can expect plenty of analytical, statistical (mostly A/B testing), and some SQL programming and stats principles questions. Most probably, you’ll be asked to analyze an assumed problem and identify key product management metrics. The second data analyst interview is with higher-level executives. Usually, questions are centered around the candidate’s background and professional experience.
LinkedIn
LinkedIn’s interview process for hiring data analysts doesn’t differ much from other companies. That said, there are phone screen interviews (expect some SQL and Python skills questions there), as well as 4 or 5 onsite data analyst interviews. About half of them focus on more advanced analytics questions, while the rest aim to assess your coding skills, and statistical knowledge (e.g. Simpson’s paradox). Something specific for the LinkedIn data analyst interview questions – a lot of them are product-related and require a product mindset and quick thinking. You may also encounter questions related to data applications and recommenders they use in their product.
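Since Simpson’s paradox tends to come up in interviews like these, it’s worth being able to reproduce a small example from memory. The sketch below uses figures in the style of the classic kidney-stone illustration: treatment A looks better within each severity group, yet worse once the groups are pooled.

```python
# A toy illustration of Simpson's paradox: (successes, trials) per group and treatment.
groups = {
    "mild":   {"A": (81, 87),   "B": (234, 270)},
    "severe": {"A": (192, 263), "B": (55, 80)},
}

totals = {"A": [0, 0], "B": [0, 0]}
for severity, arms in groups.items():
    for treatment, (successes, trials) in arms.items():
        totals[treatment][0] += successes
        totals[treatment][1] += trials
        print(f"{severity:>6} | {treatment}: {successes}/{trials} = {successes / trials:.0%}")

# A wins in both groups above, but B wins on the pooled numbers below.
for treatment, (successes, trials) in totals.items():
    print(f"overall | {treatment}: {successes}/{trials} = {successes / trials:.0%}")
```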
Google
Google’s data analyst interview process is quite standard.
You’ve got one or several phone screen interviews, followed by onsite interviews. The first phone screen is usually centered around technical data analyst questions (some candidates share they were given an online SQL test, as well). By the way, Google has its own guide for the technical part of the interviewing process and you can check it out here. The onsite interviews are conducted by 4 to 6 people. All interviewers keep their notes confidential. Therefore, the possibility of bias in the data analyst interviewers’ feedback is down to a minimum. The next step is sending the written feedback to a hiring committee (something specific for Google). Finally, the committee makes a recommendation to Google executives for approval. Anything to keep in mind? The Google hiring process can take longer than expected. So, don’t be afraid to politely request a status update if a week has passed.
How to make a great impression during the data analyst interview?
Interviewing for a data analyst position may seem a bit stressful at first. So, just in case you still feel challenged in the confidence department, as a final takeaway, take a page out of our playbook. Here’s what we’re looking for when we’re hiring expert data analysts to develop the 365 courses:
Be a good listener, so pay attention to every single word in the data analyst interview questions;
Make sure your explanations are clear and reflect your thought process;
Well, even if your answers aren’t perfect and you need some help from the interviewer, you can still make it work. Being open to receiving help means you can handle feedback and tells the interviewer you’ll probably be a solid team-player;
Communication (both verbal and non-verbal) is key – exude a positive attitude, demonstrate professionalism and be confident in your abilities. Keep in mind your tone of voice and pacing, as well as your gestures. Your body language speaks volumes! That said, you can find more about the types of non-verbal communication and how to improve your body language in this Indeed article.
In Conclusion
Last but not least, if you didn’t land the data analyst job, learn from your experience. Try doing mock data analyst interviews with a friend or a colleague. Include the challenging data analyst interview questions you couldn’t answer before and find a solution together. That will make you feel more self-assured next time you go to a data analyst job interview. To quote Mark Meloon,
“Chase fewer jobs but do a better job on them and do a post-mortem afterward so you can learn.”
That said, when you’re consistent and manage to stay organized in your data analyst job search, good things happen.
Ready for more data analyst interview questions? Follow the link to our really detailed article Data Science Interview Questions And Answers. In case you need to start building your skillset from the ground up, explore the all-around 365 Data Science Training. If you aren’t sure whether you want to turn your interest in data science into a dedicated career, we also offer a free preview version of the Data Science Program. You’ll receive 12 hours of beginner to advanced content for free. It’s a great way to see if the program fits your goals and needs.
  https://365datascience.com/data-analyst-interview-questions/ #Career
0 notes
raveenarenu-blog · 6 years
Text
Emerging Big Data Trends
The big data market will be worth US$46.34 billion by the end of 2018. This clearly shows that big data is in a constant phase of growth and evolution. IDC estimates that worldwide revenue from big data will reach US$203 billion by 2020, and that there will be close to 440,000 big data-related job roles in the US alone, with only 300,000 skilled professionals to fill them. Saying farewell to 2017 and just into the third month of 2018, we look at the marked changes in the big data space and what exciting developments may be on the horizon for big data in 2018. Following big data trends is just like watching the regular shifts in the wind: the moment you sense its direction, it changes. However, the following big data trends are likely to take shape in 2018. Learn Big Data training in Chennai at Greens Technologys.
1) Big data and open source
Forrester’s forecast report on the big data tech market reveals that Hadoop usage is increasing 32.9% year on year. Open-source big data frameworks like Hadoop, Spark, and others are dominating the big data space, and that trend is likely to continue in 2018. According to the TDWI best practices report, Hadoop for the Enterprise by Philip Russom, 60% of companies plan to have Hadoop clusters running in production by the end of 2018. Experts say that in 2018, many organizations will expand their use of big data frameworks like Hadoop, Spark, and NoSQL technologies to accelerate big data processing. Companies will hire skilled data experts versed in tools like Hadoop and Spark, so that these professionals can access and respond to data in real time through valuable business insights.
2) Big data analytics will incorporate visualization models
A 2017 survey of 2,800 experienced BI professionals predicted that data discovery and data visualization would become a major trend. Data discovery now isn’t just about understanding the analysis and relationships, but also represents ways of presenting the analysis so as to reveal deeper business insights. Humans have a far greater ability to process visual patterns effectively. Compelling and engaging visualization models will therefore become the choice for processing big data sets, making this one of the most significant big data trends in 2018.
3) 2018 will be the year of streaming success
2018 will be the year when the goal of every organization adopting a big data strategy is to achieve true streaming analytics: the ability to process and analyze a data set while it is still being created. This means gathering insights that are literally up-to-the-second, without duplicating datasets. As of now, this means compromising on the size of the dataset or tolerating a delay, but by the end of 2018 organizations will be close to removing these limits.
4) Meeting the "dark data" challenge in 2018
Notwithstanding all the hype about the increasing volume of data we generate each day, it can't be denied that many databases across the globe remain in analog form, un-digitized, and consequently unexploited for any kind of business analysis. 2018 will see increased digitization of dark data (data that isn't yet put to work) stored as paper files, historical records, or other non-digital recording formats. This new wave of dark data will enter the cloud. Organizations will develop big data solutions that allow them to move data easily into Hadoop from environments that are traditionally very dark, such as mainframes.
5) AI and machine learning will be faster, smarter, and more efficient in 2018
AI and machine learning technology are evolving at a lightning pace, helping companies transform through various use cases such as real-time ads, fraud detection, pattern recognition, voice recognition, and so on. Machine learning was among the top 10 strategic technology trends in 2017, but 2018 will see it move beyond rule-based conventional algorithms. Machine learning algorithms will become faster and more accurate, helping enterprises make more suitable predictions.
These are only some of the top big data trends that industry experts anticipate; the constantly evolving nature of this domain means we are likely to be in for a few surprises. Big data is driving the technological space towards a brighter and more advanced future. With an increasing number of organizations jumping on the big data bandwagon, 2018 will be a significant year. Here's to another great year of data-driven inventions, innovations, and discoveries.
The best way to learn big data and Hadoop
Although books and online tutorials are important channels for acquiring basic knowledge of big data and Hadoop, it is helpful to take up instructor-led big data analytics courses to gain a thorough understanding of the technology.
In an era of intense competition and evolving technologies, the best way to stand out from others is to have a solid base in the skill on which you wish to build your career. Instructor-led big data Hadoop online training gives participants hands-on experience and a better grasp of the technology.
There are several training and development institutes that provide online, live, virtual, and classroom training for professionals. You can also opt for a big data certification, which will enable you to distinguish yourself from the crowd.
Many training organizations arrange micro-session training modules that are designed specifically to suit the training needs of the participants.
Our specialists suggest choosing instructor-led online training for big data analytics courses to get the essentials cleared up; afterwards, you can choose among various online tutorials and books to strengthen your base and enhance your knowledge of big data and Hadoop. Various Ivy League educational institutions like Harvard and Stanford have also made their courses on big data and data science available online for free.
Who should learn big data?
There are no pre-defined parameters for who can learn big data analytics and who can't. However, you should know that jobs related to big data are a blend of mathematics (statistics) and computer science. Therefore, a person with quantitative aptitude and a basic understanding of computer programming is well suited to learn big data.
To decide whether or not you should train yourself in big data and Hadoop, you should understand what constitutes a big data job.
To be precise, there is no single job description for this. It is essentially a diverse array of positions up, down, and across an organization that needs data-savvy experts who can manage, process, and draw meaningful insights from the flood of data being amassed in the organization.
Many big data and Hadoop courses expect students to have experience in Java or basic knowledge of C++, Python, and Linux. However, this is not a rule set in stone.
Don't have any knowledge of Java or Linux? No worries, you can still get your hands on big data and Hadoop.
As a rule, big data analytics courses are intended for system engineers, software developers, architects, data storage administrators, project managers, BI professionals, and testing professionals.
Prerequisites for learning Hadoop
Hadoop is written in Java and it runs on Linux. Therefore, it is quite helpful for a person who wants to learn Hadoop to have knowledge of Java and to be comfortable with Linux commands.
Having said that, let me stick to what I said before: there are no strict prerequisites for learning Hadoop. Hadoop is only one of the frameworks used in big data. And, of course, it is used a lot. Indeed, it can safely be named one of the fundamental parts of big data. Yet there are several other tools and technologies besides Hadoop that are used to manage and analyze big data.
Tools such as Hive and Pig, which are built on top of Hadoop, don't require familiarity with Java; they offer their own high-level languages for working with data, which are then automatically converted into MapReduce programs in Java.
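To make the MapReduce idea behind these tools a little less abstract, here is a toy word count in plain, single-machine Python. It is only a conceptual sketch (Hadoop distributes the same steps across a cluster, and Hive or Pig would generate an equivalent job from a short query), but the map, shuffle, and reduce shape is the same.

```python
# A toy, single-machine word count showing the map -> shuffle -> reduce shape.
from collections import defaultdict

documents = [
    "big data tools include hadoop and spark",
    "hive and pig sit on top of hadoop",
]

# Map step: emit (key, value) pairs, here (word, 1)
mapped = [(word, 1) for line in documents for word in line.split()]

# Shuffle step: group the values by key
grouped = defaultdict(list)
for word, count in mapped:
    grouped[word].append(count)

# Reduce step: combine the values for each key
word_counts = {word: sum(counts) for word, counts in grouped.items()}
print(word_counts["hadoop"])  # -> 2
```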
However, it is always advantageous to know the basic concepts of Java and Linux, as it helps in mastering the skills better. Knowledge of SQL is useful for learning Hadoop because Hive, Pig, and HBase depend on query languages that are inspired by SQL.
Different training providers state different prerequisites depending on the course module, its scope, and its structure.
To spell it out in strict terms, the prerequisites for learning big data and Hadoop are:
Mathematical aptitude
A holistic understanding of computer architecture, networks, file systems, distributed computer organization, and data integration techniques
The ability to statistically analyze data
An understanding of the field and the market for which the big data and Hadoop service is required
It would be unfair to declare who can and who can't work with big data and Hadoop, or for that matter with any big data technology. It is an emerging field that is growing every day, and most of us can lay our hands on it and contribute towards its growth and improvement.
Big Data @ Greens Technologys
If you are looking for good Big Data training in Chennai, then Greens Technologys should be the first and foremost option.
We are known as the best training institute in Chennai for IT-related training. Greens Technologys already has an eminent name in Chennai for providing the best software course training.
We have more than 115 courses for you. We offer both online and in-person training along with flexible timings, so as to make things easier for you.
0 notes
technicaldr · 6 years
Text
Up, down or out? Perspectives on artificial intelligence in health care today
Two-thirds of the attendees polled at a recent innovation summit by The Economist agreed on one thing: health care is the sector that will benefit most from artificial intelligence (AI) technologies.
  In health care, which is in the midst of an industry transformation and a digitalization of key aspects of patient engagement and care management, the role of data, analytics and AI are central to the organizational mission. However, it is easy to get caught up in one aspect or another when extolling (or decrying) the role of AI, while ignoring the near-term potential as well as the limitations of the technology.
  How can AI play a role in health care today? In the words of S. Somasegar, a venture capitalist, there are three ways in which AI can impact a business today. Upwards, meaning that AI can take on intelligent capabilities that enable a higher level of interaction with humans; downwards, implying an ability to reduce costs; and outwards, which is to take AI to the edges of our computing infrastructure.
  Voice recognition and natural language processing (NLP) technologies help us move up in health care by enabling remote-monitoring and home health care through a “natural” interface with humans. In the health care enterprise, NLP technologies can “read” complex medical literature and provide doctors and clinicians with intelligent choices for diagnosis and treatment options.
  With the emergence of cheap computing and storage infrastructure, AI technologies help manage vast arrays of servers and networking equipment, detecting and remediating the most common problems without human intervention. “Purpose-built” hardware with inbuilt AI capabilities are becoming the norm in high-volume and time-sensitive operations that require running machine-learning algorithms on large data sets and doing it at low costs.
  The notion of edge computing, a paradigm that takes analytics and AI to the edges of a computing infrastructure, has lately become important in the context of the Internet of Things (IoT) and smart devices. In health care, the proliferation of intelligent devices, in and out of hospital settings, has created many new opportunities. Tom Bianculli, Chief Technology Officer of Zebra Technologies, a firm that provides mobile devices, scanners and RFID-enabled tags used in hospital environments, talks about “digital diaries” that can log every minute and every second of a device’s operation in the context of patient care. Using a network of tags and near-field communication equipment, Bianculli is now able to track a mobile device in a caregiver’s hands as she makes her way through a hospital floor, recording and analyzing everything from her precise location to her pace of walking to the direction in which she is headed with the device. Extending it to outpatient or even home health care, the deployment of intelligent devices that can analyze data at the “end point” and sending it back to a back-end system can save lives by reducing the time involved in alerting caregivers to medical emergencies.
  To some, all of this may sound futuristic. However, it doesn’t have to be complex use cases and high risk situations involving patient lives that determine whether AI is suitable for a health care institution. The vast majority of AI use cases involve “low-hanging fruit” that automates aspects of operations that are routine and repetitive in nature. AI can release humans from mundane tasks and enable them to work on more exciting and value-added tasks. In some industries with an acute shortage of skilled human resources such as health care, this may even be a necessity for long-term sustainability. 
  The use of AI technologies comes with responsibilities as well. In the wake of recent disturbing news about a driverless car causing a fatal accident and the alleged misuse of Facebook profile data to influence the last presidential elections, there was a somber tone to the discussion at The Economist event. The gathering of AI technologists and industry leaders using AI to advance their business goals paused to reflect on how AI can be force for good and bad. Among the concerns: AI technologies by themselves may not reveal any inherent biases, but may unleash all manner of biases that reflect the biases of the humans who design the systems. There is a growing sense that AI should be used not just for the right predictions, but also to make predictions for the right reasons. While AI is coming on par with humans in aspects such as reading radiology images, the same neural network algorithms have potential for discriminatory profiling based on facial recognition and other decisions that have implications for society. The usefulness of AI models also depends on the data sets: as an example, selective representation of demographic profiles in a data set can give rise to biased conclusions on populations represented by that dataset.
  The underpinning of success with AI lies in the underlying data. Fortune 500 companies are spending up to 50 percent or more of their IT budgets on information integration today, and no sector is more acutely aware of this than health care, with its complex environment of proprietary electronic health record (EHR) systems and emerging data sources. Unlike in other sectors such as consumer finance and retailing which are long used to multi-channel engagement with customers based on an omni-data capability that can aggregate and integrate data from a wide variety of sources, health care remains more siloed today than any other sector. The implications for AI adoption are clear: it will be slower than in other sectors.
  Finally, having the data and the AI capability doesn’t ensure improved quality or reduced costs in health care. You need intervention models in place to do something with the data and have care plans for doing the preventive intervention, which can be challenging if the data is incomplete (as often the case with EHR data) or outdated (as with health insurance claims data). In an era of high-volume and high-velocity real-time data, these limitations will restrain the adoption of AI technologies.
  As computing costs drop and AI technologies mature, health care and other industries will have to invest and catch up or get left behind in the great digital transformation under way. As someone said to me, there is a penalty for inaction. That penalty may be too big a cost to pay for most enterprises today. 
0 notes
the-fitsquad · 6 years
Text
Laptop And Telecommunications Solutions
Standing Laptop Workstation
AirClean® Systems offers two varieties of PCR workstations, each equipped with a UVTect microprocessor controller, that aid the researcher by preventing background and cross-contamination. A two-tier 19″ cabinet is designed for installing the central unit of the system or other kinds of electronic devices. HP’s reinvention of detachable PCs started earlier this year with the introduction of the HP Spectre x2 and the HP Elite x2. Today’s introduction of the HP ZBook x2 completes HP’s trifecta with its most powerful detachable solution targeted at the creative community. Powered by Intel’s latest Sandy Bridge-based Xeon processor family, the E5-2600, HP’s new Z series machines benefit from the CPU’s integrated memory controllers, Hyper-Threading, and Turbo Boost technology. In a previous organization there were 6 CAD2-badged workstations, and the supplier was on site the next day as there were significant problems with the machines crashing when using PDMWorks.
That CPU upgrade also brings the type of perks normally restricted to desktops, such as support for a whopping 64GB of memory and Thunderbolt three ports that can handle dual 4K displays. But Cinema 4D is one particular of the pieces of software program that is most closely linked with specialist-good quality 3D design and style perform. Ahead of we take a look at the HP workstations, it’s critical to first think about what is essential for 4K video editing. HP worked straight with partners like Adobe and Avid to create computer software that’s optimized specifically to take benefit of HP hardware and the NVIDIA CUDA platform to wring even a lot more powerhouse performance out of the GPU. Based on the newest Intel® Core two Duo, Dual & Quad Processors. I haven’t built a box from scratch but, but I’ve been upgrading, re-purposing and modifying hardware and operating systems for my specific workflow for lengthy time.
Both hardware approaches give powerful security for administrative accounts against credential theft and reuse. Our properly-created screens and partition panels aid the ‘think’ tank desk configuration, retain privacy and lessen noise in a fashionable way utilizing our desk based partition screens Workplace workstation screens come in a assortment of different colours and dimensions generating for a trendy, groovy and functional office desk workstation. Now the idea behind this build is that it really is an all goal workstation that is not necessarily application particular however has plenty of area for upgrades depending on application or if extra functionality is necessary down the line. The ZDesigner LP 2844 Driver works completely with both the LP 2844 and the ZP 450 thermal printers.
If you do not see the Public Access Station, End-of-Variety PAC Station, On-Line Monitor Stand or Pc Workstation you are seeking for, please get in touch with our friendly Client Service Employees for help at 800-548-7204. ~ Gorgeous desks for house workplace spaces assist generate much more than just workspaces. The details that would truly make this expertise effective for me is to learn how to market my enterprise, to develop robust client relationships, and to set my own requirements of operating a business so I can stay true to my beliefs and get the appropriate sort of client. Our in-property CAD and workplace design and style group give the initial step in generating the perfect workplace match out solutions for your space and folks needs. This bundle involves a 256GB PCIe SSD, which makes it possible for this mobile workstation to begin Windows and applications amazingly quick.
We’ll be focusing on PCs that run Windows operating systems, rather than Linus or MacOS, due to the method specifications of a lot of popular CAD applications. With our control space workstation you can monitor all methods in production, guaranteeing both the good quality and compliance of the merchandise. Back in 2009, the initial Z800 sported a pair of quad-core Xeon CPUs operating at three.2GHz, and 18GB of DDR3 memory while the refreshed Z800 , released a year later, featured a pair of six-core CPUs, running at three.33GHz. Comparatively less costly, workstations are fundamentally more for official jobs such as emailing, document creation and presentation, internet browsing and other organization-related performances. This write-up focuses primarily on the top end models—the Z420, Z620, and Z820—and will examine the benefits that HP workstations offer and how to configure them for 4K post-production function.
Currently, Sun Microsystems companies the only workstations, which use x86-64 microprocessors and Windows, Mac OS X, Solaris 10 and Linux-distributed operating systems. If you can afford it, possessing several SSDs (1 for the OS and AutoCAD and a second committed for active projects) along with a larger traditional drive for storage is even much better. If a buyer with an older machine wants a problem fixed, HP can usually go back to this vault and uncover the solution to investigation and solve the problem. There will be ‘cheaper’ possibilities in the Z6 and Z4. The Z6 caps the memory at 384GB with the dual Xeon CPU. Like all HP workstations, the Z620 is made to satisfy the ever-increasing demands of pros who work with huge and complicated datasets, complex 3D models, require a number of displays, or for organizations that just need a higher-functionality and reputable system for their personnel.
Figure 12. A computer model with a number of supported graphics cards to decide on from. On December three, 1999, a 40-year-old tower-painting-firm owner, his 16-year-old stepson, and a 19-year-old employee died following falling 1,200 feet when the hoist line on a portable capstan hoist used to raise them up the side of a 1,500-foot-higher radio tower started slipping around the capstan, causing the hoist operator to lose handle of the hoist line. Powered by Intel® Xeon® processors and NVIDIA® Quadro® graphics, this dual-processor workstation is 1 tough performer. Prepared for the most demanding design and style applications, the Broadberry CyberStation Graphics® variety can be configured with up to 1.5TB of memory. You can add users to the RealNex CRM either from Database Manager on the server Pc, or from inside the RealNex CRM Desktop application even though logged in as an Administrator.
Numerous of the desks we discuss have designated spots for personal computer towers, keyboards and mice, and shelves for office supplies. Today, these nations, specially the United states, invest big sums of capital which aid advance the technology and productivity of so-referred to as ‘backward societies’. It is to be surrendered this plotting out social gathering is in the furnishings enterprise for far more than 10 years and have been providing shocking high quality and top quality issues and satisfying our regarded consumers other than. Mobile personal computer desk stations are revolutionary and hugely ergonomic operate solutions for your residence workplace or even your corporate job. Want to visualize what these inventive workstations and private offices will look like in distinct colors?
Intel’s C612 chipset can help dual, subsequent generation Intel Xeon E5-2600 v3 or E5-2600 v4 series processors. CELSIUS H770: a mobile workstation with NVIDIA Quadro graphics for a 4K resolution display along with the newest mobile Intel Xeon processors and Fujitsu’s special PalmSecure technologies. There are lots of firms, who are now manufacturing various kinds of modular furniture and furnishing products, which are being employed by the interior decorator or furnishing contractors to help for beginners their respective client to get the maximum utilization of the available space and also the corporate look of their workplace. Nonetheless, I do inform men and women of the program requirements and how the distinct pieces of hardware match in with Autodesk application and what minimum specs I advocate.
Although the end-outcome (the video) is visual”, video encoding is a computationally intensive, traditionally CPU-based application in the way that CFD or FEA simulation applications are computationally intensive and traditionally CPU-based applications. The evening auditor ought to also be familiar with the nature of cash transaction affecting the front office accounting system. Ergonomic and versatile contemporary office furniture such as height adjustable tables let the staff to sit and stand whilst functioning as per their comfort. HP performs with each national retailers and specialized neighborhood resellers to bring our workstations to you with excellent client service. Of course, like most dual processor workstations, the burden of two high-end Xeons weighs heavy with the all round cost of the machine coming in at £5,620.
For instance, integrating CAD technologies with Personal computer Numerical Handle (CNC) machines or Additive Manufacturing processes (3D printers) e.g. Fused Deposition Modeling (FDM) machines. The user interface of VMware Workstation has been created to a workspace connected with clear menus, live thumbnails and a library of robust virtual machine. For AutoCAD 2017, for example, numerous certified machines feature processing power of at least three GHz, with RAM usually 16 or 32 GB rather than the suggested 8—with numerous machines featuring up to 64 GB. This ought to aid you to come up with a much better estimate for what your machine ought to incorporate. This Game Boy-like device attributes a 1.two-megapixel with Sony Mavica optics, headphone jack, microphone jack, an Intel processor, and even a trackball.
The two graphics cards I am proposing each have 512 cores of GPU processing power and 3 GB of GDDR5 memory. A single AMD FirePro graphics card can drive up to six individual displays on an HP workstation. An additional aspect to consider is that different CAD applications have different hardware configuration requirements. So, assuming you agree that a proper CAD workstation is a sound business investment and not just another laptop, in the next section we will look at matching your CAD workstation to your software and why this is also critical.
A handful of years ago, we detailed how to set up a multi-monitor workstation and how to use a number of displays along with a laptop Here we concentrate on recent developments that can assist you program and deploy a modern day, multi-monitor workspace. HP’s Z620 Workstation excels in these high-performance scenarios due to its multitude of processor and memory options as nicely as an extensive list of GPUs for quite specific configurability from entry level 3D with the NVIDIA Quadro 410, NVIDIA Quadro K600 and FirePro V3900, to ultra higher-end GPU computing cards with the NVIDIA Tesla K20c and K40, the Z620 can be customized for variety of use situations in several various industries. It is supported by Dell Technical Help Precision Workstation T1700.,Dell is a direct partner to firms and shoppers that delivers revolutionary technology and solutions.,The Dell T3400 Workstation is a shining instance of what a dependable pc need to be!, overview by rgathright.
This versatile series of desks let you to customize the perfect desk for your space. An HP StorageWorks Options Specialist collaborates with you to architect solutions that place your information to operate, and he enables your business to grow and adapt to modify. For existing buyers only HardSoft offer you leasing more than Two years on the Apple iPhone 8, eight Plus, and X. With choices on assistance as nicely as a choice of O2 Enterprise tariffs. Each and every DDR3 module is developed to operate at 1866MHz and will supply every of our Xeon E5-2670 processors with quad-channel memory. LA Micro UK gives the lowest prices and the quickest lead times on HP Workstations, such as the Z620. Supply your workers with the best workstations feasible through our choice of sturdy, properly-made office desks.
With remote access they can view all applications, documents and monitor their business from their mobile workstation. Pop-up power and information stations are a need to if your staff use laptops, desktops or any other devices that demand sustainable energy supply to operate for prolonged periods. HP recently added some new members to its loved ones of workstations and mobile machines and took a lot of the guesswork out of acquiring each customer- and skilled-level computer systems, and gives possibilities with a dizzying array of configurations that are certain to uncover a spot in your house or office. Adrian Arnott, African Life’s outgoing chairman, speaks to the possibilities of tis method when he was quoted as noting that, “This is a new sort of alliance among enterprise and the black community that I’d like to think is a strategic coup” (WSJ, ibid).
If you have a quad core processor clocked at 3 GHZ and a quad core processor clocked at 2.5 GHZ and each are rendering the same point, the three GHZ will finish faster simply because it has 500 far more MHZ of processing speed across its 4 cores. Guests who present an acceptable credit card at registration might be extended a line of credit equal to the floor limit credit limit set by the issuing credit card company. Supplying assistance for up to eight displays and robust OpenGL support, NVIDIA® graphics are totally certified and optimized with business-leading applications from AutoDesk, Solidworks, Adobe,® and numerous others for enhanced functionality and reliability. On prime of that, it comes with a 512GB PCIe SSD which is one of the fastest storage drives available, greatly improving responsiveness, reducing booting speed and program launch time.
You may be surprised at the level of style and sophistication in designer office furniture these days. The design of the desk is a bit different from a classic desk: it’s an extremely modern design that would suit the workplace or home office décor of a true Mac fan. Massive data sets, simulations, and MATLAB-to-wafer-fab workloads are no problem with 44 cores and up to 1TB of 2400MHz ECC memory. While building furniture plans with our customers and brainstorming numerous design ideas and layout possibilities, it is occasionally difficult for them to visualize what these color and configuration combinations will look like and how they will meld with their office interior.
from KelsusIT.com – Refurbished laptops, desktop computers , servers http://bit.ly/2ylDoo9 via IFTTT
0 notes
lindyhunt · 6 years
Text
How to Make Competitive Intelligence Your Competitive Advantage
I stared at the sea of snow around me. White as far as the eye could reach. A blank, empty landscape.
It was exhilarating at first. No one ahead, no one tailing or overtaking me.
Unencumbered, I was completely alone.
Then I realized I was lost. What direction was I going in? Was I going in circles? Was I even on a path?
It was my first snowboarding trip in Japan and I had taken a wrong turn. And now, the beautiful landscape around me was devoid of anyone, or anything, that might give me a sense of direction.
Years later, it’s this experience that comes to mind when my friends in the startup industry complain about their competitors.
They long for a blank expanse of a market, free of competitors hurling their way forward alongside (or ahead, or behind) them. It’s an entrepreneur’s daydream: no one to stop you from speeding ahead, eyes firmly on the prize.
Of course, markets devoid of competitors are rare: if a product is worth making, the market around it won’t stay empty for long. If it does, the product probably wasn’t worth making.
Even so, the appeal remains. And that’s why we’re often told to ignore our competitors completely.
Shut them out, focus single-mindedly on your product and users: everything else will fall into place.
And it’s certainly true that an obsession with ‘the competition’ is a bad idea. When companies get tangled up in rivalries, they often miss out on more important things (while Coke and Pepsi were battling it out, they failed to cash in on the arrival of energy drinks — Red Bull is now the most popular soft drink in the world).
But do we have to choose between a blinkered focus or competitor obsession?
I don’t think so. If we can adjust our lens, the landscape around us will begin to look different. Blinkered focus can turn into a peripheral vision that is both inclusive and alive with information.
A single company — especially a startup — will naturally be limited in terms of the information it relies on. The combined knowledge of a number of companies operating in the same arena (or neighboring landscapes) is inevitably much more extensive.
And so, we can begin to view competitors as invaluable resources rather than pesky rivals.
Here’s how.
The Perks of a Busy Market
Many startups enter an already-crowded market. That’s often seen as daunting, or even deal-breakingly negative.
It’s not.
In fact, entering an unexplored arena with no context is much more daunting.
The support that comes with having competitors may not be obvious at first, but it’s real.
By their mere presence, a competitor is already helping you carve out your category, make noise, and educate your users. They are legitimizing your idea, and proving there’s a market for it.
As Joel Gascoigne, CEO of Buffer puts it:
The real problem startups have is that most people don’t know about them.
Competition can help to shine light on the market, which is often actually more useful than if you were alone.
There are cautionary tales of businesses that became ‘victims of active inertia’ — lazy on their initial strong competitor-free success — and ultimately failed. And there are other businesses that, spurred on by new competitors, kicked their complacency habit to the curb.
That’s because strong competition is the ultimate accelerator. Each individual step forward counts as a step for everyone.
Would Nike still be the king of sportswear without Adidas snapping at their trainered-heels? What would Steve Jobs’ legacy be without Bill Gates? Would Salesforce’s boom be as resounding without Oracle pushing it along?
Competition strengthens, rather than weakens, a market. Companies striving for leadership and out-servicing each other to win customers generates tons of value. Plus, saturated markets are the most profitable. And if your business can thrive in a startup-crowded jungle, it can survive (almost) anything.
Peripheral Vision
The benefits of a competitive market aren’t reaped automatically. What you need is an all-encompassing awareness of the industry you’re in — its peaks and troughs, new players and old-timers, innovations and disasters.
This will keep you nimble and agile, particularly if, like me, you’re in a market that’s constantly shape-shifting and whizzing forward.
With peripheral vision of the big picture, you can steer fast and respond faster. And when you adopt this frame of mind, the competitive landscape turns into a goldmine of information to guide you forward.
But how do we tap into this competitive intelligence?
It boils down to three basic steps:
Gather data from a bunch of different websites
Analyze the data
Interpret the results
Of course, we do this kind of stuff all the time. The problem is, it tends to be a tedious, time-consuming process woven with endless opportunities for distraction — particularly as half of the relevant information will be buried under the noise of the internet.
And really, there are a ton of better ways we could be spending our work day. Humans aren’t great at dealing with large datasets. Machines are.
Enter bots — clever automated systems. By linking to, and communicating with, multiple APIs (Application Programming Interfaces), machines like our new digital assistant, GrowthBot, can help us map the landscape we’re operating in.
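You don’t need a finished bot to start benefiting from this pattern. Below is a rough Python sketch of the gather-analyze-interpret loop described above; the feed URLs and the JSON shape are placeholders invented for illustration, and this is not a description of how GrowthBot actually works.

```python
# Gather: pull competitor news feeds; Analyze: count keyword mentions;
# Interpret: surface the signals worth a human's attention.
# The URLs and feed format below are hypothetical placeholders.
from collections import Counter
import requests

COMPETITOR_FEEDS = {
    "competitor_a": "https://example.com/feeds/competitor-a.json",  # placeholder
    "competitor_b": "https://example.com/feeds/competitor-b.json",  # placeholder
}
KEYWORDS = ["pricing", "launch", "funding", "hiring"]

mentions = Counter()
for name, url in COMPETITOR_FEEDS.items():
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    for item in response.json().get("items", []):   # assumed feed shape
        title = item.get("title", "").lower()
        for keyword in KEYWORDS:
            if keyword in title:
                mentions[(name, keyword)] += 1

for (name, keyword), count in mentions.most_common(5):
    print(f"{name}: '{keyword}' mentioned {count} times this week")
```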
But the potential of automation goes further than this. Our capabilities as humans are finite, and often biased. As important as it is to stay on top of our industry’s trends and transformations, we’ll never be able to know it all ourselves.
That’s because we rely on the limited amount of information we’ve come across rather than a complete knowledge of our field (which would be impossible). Plus, we can’t plug in a new server when we reach our information-processing limits.
A smart machine, however, will be able to learn literally everything there is to know about an industry and its players.
It can then summarize and present the core content back to us, bringing us as close to being all-knowing, all-seeing experts as it’s possible to be.
Reducing Trial and Error
Here’s the thing: all businesses are testing the waters for someone else. It’s up to those who follow in their footsteps to capitalize on this.
By paying close attention, a trailing company can cleverly reap the benefits of someone else’s successes and failures.
That’s not as evil as it sounds. It’s basically common sense, or the ‘second-mover advantage’.
While the pioneer pays a steep price in creating the product category, the later entrant can learn from the experience of the pioneer, enjoying lower costs and making fewer mistakes as a result.
Imagine if Google hadn’t paid attention to the mistakes in Yahoo’s UI, or if Uber hadn’t exploited the public enthusiasm for Lyft?
Indeed, many of the world’s most successful businesses aren’t built on originality. Instead, they’re launched by entrepreneurs who simply saw a gap in an existing concept. Perhaps that gap was better branding, smarter customer service or elevated UX.
One example is the way Apple redefined and dominated the market for mobiles — a category which Motorola pioneered.
The point is, this success is built on the swift, agile following — and then improving — of an idea; not on being the first to execute it.
Similarly, you can come to a conclusion in seconds (for free) that it might have taken another business hundreds of hours and thousands of dollars to get to.
There are shortcuts everywhere when you start looking. Opportunities to reduce trial and error on the basis of someone else’s experience.
Maybe similar strategies will work for you, maybe they won’t. Either way, the more you know, the better-informed your future decisions will be.
Put simply: insights aren’t necessarily less useful when they’re second-hand.
Collective (Competitive) Intelligence
Collective intelligence within an industry is something we can all contribute to and benefit from, by pooling resources and building on what’s already been achieved.
Turning a potential battleground into a network, as in the case of open-source software.
And along the road somewhere, your competitor could even become your collaborator: enter ‘coopetition’ (cooperative competition).
The big players have set the trend: Spot.IM has started using Facebook’s API, despite the fact that their goal is to keep users engaged on websites that aren’t Facebook. Apple and Microsoft joined forces on the licensing of mobile operating system features and patents. Google funded Mozilla’s Firefox. Arch-rivals Amazon and Apple teamed up to distribute Amazon’s e-books through the iPad’s Kindle app.
All these companies have the same target market, but they found that combining forces — rather than constantly trying to one-up each other — is a powerful tool. And this mindset can be put into practice on a smaller scale, too.
As long as we’re keen to grow, improve and adapt, there will always be value in questioning the tactics of businesses besides ourselves. Maybe the answers will surprise us, maybe they will inspire change.
Businesses like ours, yes; but also unrelated businesses we simply find interesting.
So no, it’s not about an unhealthy obsession with our competitors. But it’s not about shutting out those around us to create the mirage of an empty landscape, either.
It’s about curiosity, open-mindedness, a willingness to learn from others. And with the help of today’s AI, competitive intelligence can become a competitive advantage.
0 notes