http://metapractice.livejournal.com/460792.html
Another association linking techniques (or an amplification of one). Bandler's New Behavior Generator is essentially an imprinting technique, because the NBG technique contains:
- role modeling;
- fixation by belief;
- powerful kinesthetics.
(I'll say up front that I haven't read the technique in Bandler's original description; I simply couldn't find it online. But I think the, er, original version does include these three mandatory steps. Sounds funny, but what can you do.) In the Russian NLP segment there is no hint of these mandatory steps; everyone does it their own way.
Off-topic: if any metapractitioners have the New Behavior Generator technique as Bandler described it, please share it; I'd be grateful.
http://metapractice.livejournal.com/492280.html?thread=12687608#t12687608



New Behavior Generator Strategy


http://www.nlpu.com/Patterns/patt16.htm


One of the most essential processes of change is that of moving from a dream or vision to action. NLP has developed a kind of 'all purpose' creativity strategy, organized around the process of moving from vision to action, called 'The New Behavior Generator'. The basic steps of the New Behavior Generator were set out by John Grinder in the late 1970s.

The New Behavior Generator is an elegant strategy that can be applied to almost any situation that involves personal flexibility. The basic steps involve forming a visual image of a desired behavior, kinesthetically associating into the image on a feeling level, and verbalizing any missing or needed elements.





New Behavior Generator T.O.T.E.


These three steps form a feedback loop in which vision and action interact through the intermediate processes of emotion and communication.

The goal of the New Behavior Generator is to go through a type of mental 'dress rehearsal' by generating imaginary scenarios and bringing them to concrete actions by connecting the images to the kinesthetic representational system. The strategy is based on several key beliefs:


  1. People learn new behaviors by creating new mental maps in their brains.

  2. The more complete you make your mental maps, the more likely you will be to achieve the new behavior you want.

  3. Focusing on your goal is the quickest way to achieve new behaviors.

  4. People already have the mental resources they need to achieve new behaviors. Success is a function of accessing and organizing what is already there.

The New Behavior Generator is a 'How To' process that both expresses and supports these beliefs through the process of acting "as if". Like all NLP strategies, the New Behavior Generator follows a particular cognitive sequence, made up of processes involving the various sensory representational systems. Each step in the sequence is also supported by behavioral cues in the form of eye movements. These eye positions help to focus and stabilize the particular representational system to be accessed.

Basic Steps of the New Behavior Generator Strategy

The basic steps of the New Behavior Generator Strategy involve:


  1. Asking yourself, "If I could already achieve my new goal, what would I look like?"
    (Do this while putting your eyes down and to your left.)

  2. Picturing yourself achieving your goal. (Look up and to your right to help stimulate your imagination.)

  3. To help you visualize:

    1. Remember a similar successful achievement.

    2. Model someone else.

    3. Picture yourself first achieving a smaller part of the goal.
      (Move your eyes up and to the left or right.)


  4. Stepping into the picture so you feel yourself doing what you pictured.
    (Put your eyes and head down and to the right as you get into the feeling.)

  5. Comparing these feelings to feelings from a similar past success.
    (Keep your eyes and head turned down and to the right.)

  6. If the feelings are not the same, name what you need and add it to your goal. Go back to step 1 and repeat the process with your expanded goal.
    (Move your eyes and head down and to the left.)
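
Viewed as a T.O.T.E., steps 1-6 amount to a test-operate loop: test whether the felt sense of the imagined behavior matches a known success, and if not, operate by naming the missing resource, expanding the goal, and cycling back to step 1. Here is a minimal schematic sketch of that loop in Python (not part of the original article; every callable is a placeholder for a mental operation, not a real library):

    # Schematic sketch of the New Behavior Generator T.O.T.E. loop.
    # All function arguments stand in for the mental operations in
    # steps 1-6 above; none correspond to real library calls.
    def new_behavior_generator(goal, recall_success_feeling, visualize,
                               step_in, name_missing):
        reference = recall_success_feeling()        # step 5's comparison feeling
        while True:
            image = visualize(goal)                 # steps 1-3: see yourself achieving it
            feeling = step_in(image)                # step 4: associate into the image
            missing = name_missing(feeling, reference)  # steps 5-6: compare (test)
            if missing is None:                     # feelings match: exit the loop
                return goal
            goal = goal + " AND " + missing         # step 6: expand the goal (operate)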




Sequence of Accessing Cues for
New Behavior Generator Strategy



Detailed Description of the Steps of the New Behavior Generator Strategy

Each of the basic steps of the New Behavior Generator Strategy can be done with precision and rigor in order to enhance its practical effects. The following is a detailed description of each step in the strategy:


  1. Say to yourself, "If I was already able to...(state your goal)...what would I look like?" (Ad)

  2. Construct a visual image of what you would look like if you were in the act of achieving the full goal you have just stated. You should be seeing yourself in this image from a dissociated point of view, as if you were above or next to yourself, looking at yourself. (Vc)

  3. If you have trouble coming up with a clear image of yourself, use one of the following strategies:

    1. Chunk down your goal into smaller steps. Ask yourself, "Is there any portion of my goal that I can see myself achieving?" For instance, "Can I see myself accomplishing the first step of my stated goal?" Visualize yourself successfully achieving that smaller part of your goal. (Ad→Vc)

    2. Use an image of yourself from a similar successful situation. Ask yourself, "Is there something similar to my goal that I can already achieve?" Visualize what you do in that situation and edit or modify the image to fit your current goal. (Ad→Vr→Vc)

    3. Model someone else. Ask yourself, "Who do I know that is already able to fully achieve the goal I have stated?" Visualize what this other person does to be successful. Then visualize yourself doing what you just saw your model doing. (Ad→Vr→Vc)


  4. Mentally step inside of the image you created of yourself achieving your goal so that you feel as though you are doing right now what you just saw yourself doing. What would you be seeing, hearing and feeling? (Vc→Kc)

  5. Compare the feelings you have as you put yourself fully into that experience with the feelings you identified earlier from a similar experience in which you are already successful. (Kc/Kr)

  6. Decision Point

    1. If the two feelings match exactly so that you feel as confident that you can achieve your new goal as easily as the goal you have already achieved successfully, then you are done.

    2. If the two feelings do not match then name what is missing or what is needed (i.e., "creativity," "more confidence," "be more relaxed," etc.).


  7. Apply the same rule to this statement of the needed resource that you applied to your initial goal statement. That is, state it positively. For example, if your statement of what is needed is "be less nervous," ask yourself, "If I could be less nervous what would I be doing instead?" (Ad)

  8. Refine your goal by taking the name of the needed resource that you have identified and adding it to your goal statement by simply connecting it with the word "and." For example, the goal statement may now be something like, "I want to be more assertive with my co-workers (initial goal statement) AND keep in mind their feelings as well." Go back to step #1 and repeat the strategy. (Ad)











[Diagram: 1. State goal in positive form → 2. Visualize yourself achieving it → 4./5. Does it feel like I can really do it? → 6./7. Name what is needed or what is missing]


Specific Steps of the New Behavior Generator Strategy


NOTE: You may add any number of needed resources to your goal statement so that when you are done you may have refined your goal to something like: "I want to be more assertive with my co-workers AND keep their feelings in mind as well AND maintain a sense of my own self confidence AND remain cool if someone gets angry."

References

Dilts, R. and Epstein, T., Tools for Dreamers, Meta Publications, Capitola, CA, 1991.


This page, and all contents, are Copyright © 1998 by Robert Dilts, Santa Cruz, CA.

Understood. So what's needed is one simple piece of information: the list of emotions each project diagnoses.
Then everything will fall into place.

metanymous in the Metapractice post (original on LJ)

Aha! Excellent! We should discuss this in a separate thread!

bavi in the Metapractice post (original on LJ)

I found it on Dilts's site; ah, here is the link: http://www.nlpu.com/Patterns/patt16.htm
http://metapractice.livejournal.com/497371.html
http://openmeta.livejournal.com/234097.html
20+ Emotion Recognition APIs That Will Leave You Impressed, and Concerned
Posted by Bill Doerrfeld

If businesses could sense emotion using tech at all times, they could capitalize on it to sell to the consumer at the opportune moment. Sound like 1984? The truth is that it's not that far from reality. Machine emotional intelligence is a burgeoning frontier that could have huge consequences not only in advertising, but in new startups, healthcare, wearables, education, and more.

There's a lot of API-accessible software online that parallels the human ability to discern emotive gestures. These algorithm-driven APIs use facial detection and semantic analysis to interpret mood from photos, videos, text, and speech. Today we explore over 20 emotion recognition APIs and SDKs that can be used in projects to interpret a user's mood.

How Do Emotion Recognition APIs Work?

Emotive analytics is an interesting blend of psychology and technology. Though arguably reductive, many facial expression detection tools lump human emotion into 7 main categories: Joy, Sadness, Anger, Fear, Surprise, Contempt, and Disgust. With facial emotion detection, algorithms detect faces within a photo or video, and sense micro-expressions by analyzing the relationship between points on the face, based on curated databases compiled in academic environments.

To detect emotion in the written word, sentiment analysis processing software can analyze text to conclude if a statement is generally positive or negative based on keywords and their valence index. Lastly, sonic algorithms have been produced that analyze recorded speech for both tone and word content.
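
As a toy illustration of the keyword-and-valence idea, here is a minimal sentiment scorer in Python. The lexicon and its weights are invented for the example; real tools rely on large curated databases and deeper linguistic analysis:

    # Toy keyword-valence sentiment scorer; the lexicon weights are made up.
    VALENCE = {"love": 2.0, "great": 1.5, "good": 1.0,
               "bad": -1.0, "awful": -1.5, "hate": -2.0}

    def sentiment(text):
        score = sum(VALENCE.get(word.strip(".,!?"), 0.0)
                    for word in text.lower().split())
        return "positive" if score > 0 else "negative" if score < 0 else "neutral"

    print(sentiment("I love this product, it is great!"))  # -> positive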

Use Cases For Emotion Recognition

Smile — you’re being watched. The visual detection market is expanding tremendously. It was recently estimated that the global advanced facial recognition market will grow from $2.77 Billion in 2015 to $6.19 Billion in 2020. Emotion recognition takes mere facial detection/recognition a step further, and its use cases are nearly endless.

An obvious use case is within group testing. User response to video games, commercials, or products can all be tested at a larger scale, with large amounts of data accumulated automatically, and thus more efficiently. Bentley used facial expression recognition in a marketing campaign to suggest car model types based on emotive responses to certain stimuli. Technology that reveals your feelings has also been suggested to spot struggling students in a classroom environment, or to help autistic people better interact with others. Some use cases include:


  • Helping to better measure TV ratings.

  • Adding another layer of security at malls, airports, sports arenas, and other public venues to detect malicious intent.

  • Wearables that help autistic people discern emotion.

  • Checkout counters and virtual shopping.

  • Creating new virtual reality experiences.

Facial Detection APIs that Recognize Mood

These computer vision APIs use facial detection, eye tracking, and specific facial position cues to determine a subject’s mood. There are many APIs that scan an image or video to detect faces, but these go the extra mile to spit back an emotive state. This is often a combination of weight assigned to 7 basic emotions, and valence — the subject’s overall sentiment.
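
Response layouts differ from vendor to vendor, but most boil down to a per-face dictionary of emotion weights plus a valence score. A hypothetical example of picking out the dominant emotion (the JSON shape here is invented for illustration):

    # Hypothetical response shape; each vendor's actual JSON differs.
    response = {"faces": [{
        "emotions": {"joy": 0.81, "sadness": 0.02, "anger": 0.01, "fear": 0.01,
                     "surprise": 0.12, "contempt": 0.01, "disgust": 0.02},
        "valence": 0.74,  # overall positive/negative sentiment
    }]}

    for face in response["faces"]:
        dominant = max(face["emotions"], key=face["emotions"].get)
        print(dominant, face["emotions"][dominant], face["valence"])  # joy 0.81 0.74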

1: Emotient

Emotient is great for an ad campaign that wants to track attention, engagement, and sentiment from viewers. The RESTful Emotient Web API can be integrated into apps, or used to help power A/B testing. In addition to the API, there's a good account analytics panel. View a demo here.


2: Affectiva

With 3,289,274 faces analyzed to date, Affectiva is another solution for massive scale engagement detection. They offer SDKs and APIs for mobile developers, and provide nice visual analytics to track expressions over time. Visit their test demo to graph data points in response to viewing various ads.

3: EmoVu

Produced by Eyeris, EmoVu facial detection products incorporate machine learning and micro expression detection that allow an agency to “accurately measure their content’s emotional engagement and effectiveness on their target audience.” With a Desktop SDK, Mobile SDK, and an API for fine grained control, EmoVu offers wide platform support, including many tracking features, like head position, tilt, eye tracking, eye open/close, and more. They offer a free demo with account creation.



4: Nviso

Switzerland-based Nviso specializes in emotion video analytics, using 3D facial imaging tech to monitor many different facial data points and produce likelihoods for the 7 main emotions. Though no free demo is offered, Nviso claims to provide a real-time imaging API. It has a solid reputation, having received an IBM award for smarter computing in 2013. With its international corporate vibe, Nviso may not be the choice for a developer looking for quick plug-and-play ability with immediate support.

5: Kairos

The Emotion Analysis API by Kairos is a more SaaS-y startup in the facial detection arena. Scalable and on-demand: you send them video, and they send back coordinates that detect smiles, surprise, anger, dislike, and drowsiness. They offer a free demo (no account setup required) that will analyze and graph your facial responses to a few commercial ads.

The sleek-branded Kairos could be a developer favorite. It looks newly supported with a growing community, with transparent documentation for its Face Recognition API, Crowd Analytics SDK, and Reporting API. The catch is that the Emotion Analysis API is still in beta.

6: Project Oxford by Microsoft

Microsoft's Project Oxford is a catalogue of artificial intelligence APIs focused on computer vision, speech, and language analysis. After the project's age-guessing tool went viral last year for its "incongruities," some may be reluctant to try Microsoft's emotion detection capabilities (this is the app that thought Keanu was only 0.01831 sad).

Nordic APIs founders Travis Spencer and Andreas Krohn – 99% happy

The API only works with photos. It detects faces and responds in JSON with ridiculously specific percentages for each face across the core 7 emotions, plus Neutral. Truncate the decimals and this is a simple, to-the-point API, a very useful tool in the right situation. Upload a photo to the free online demo here to test Project Oxford's computer vision capabilities.
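
A minimal sketch of calling the Emotion API from Python; the endpoint, header name, and response fields below reflect the v1.0 beta documentation of the time and should be treated as assumptions that may have changed since:

    import requests

    # Endpoint and header as documented for the Project Oxford Emotion API
    # beta circa 2016; treat both as assumptions.
    URL = "https://api.projectoxford.ai/emotion/v1.0/recognize"
    headers = {"Ocp-Apim-Subscription-Key": "YOUR_KEY"}

    resp = requests.post(URL, headers=headers,
                         json={"url": "https://example.com/face.jpg"})
    for face in resp.json():
        scores = face["scores"]  # anger, contempt, ..., happiness, neutral
        # Truncate the "ridiculously specific" decimals mentioned above.
        print({emotion: round(value, 2) for emotion, value in scores.items()})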

7: Face Reader by Noldus

Used in the academic sphere, the Face Reader API by Noldus is based on machine learning, tapping into a database of 10,000 facial expression images. The API uses 500 key facial points to analyze 6 basic facial expressions as well as neutral and contempt. Face Reader also detects gaze direction and head orientation. Noldus seems to have a solid amount of research backing its software.


8: Sightcorp

Sightcorp is another facial recognition provider. Their Insight SDK offers wide platform support, tracks hundreds of facial points and eye gaze, and has been used in creative projects, museum showcases, and at TEDx Amsterdam. Sightcorp's F.A.C.E. API (still in beta) is a cloud analysis engine for automated emotional expression detection.


9: SkyBiometry

SkyBiometry is a cloud-based face detection and recognition tool which allows you to detect emotion in photos. Upload a file, and SkyBiometry detects faces and senses the mood between happy, sad, angry, surprised, disgusted, scared, and neutral, with a percentage rate for each point. It accurately determines if a person is smiling or not. A benefit to SkyBiometry is that it's a spin-off of a successful biometric company, so the team's been around for a while. Check out their free demo to see how it works, and view their extensive online API documentation.

10: Face++

From their developer center, the onboarding process for Face++ looks intuitive. Face++ is more of a face recognition tool that compares faces with stored faces — perfect for name tagging photos in social networks. It makes our list because it does determine if a subject is smiling or not. Face++ has a wide set of developer SDKs in various languages, and an online demo.


11: Imotions

Imotions is a biometrics research platform that provides software and hardware for monitoring many types of bodily cues. Imotions syncs with Emotient's facial expression technology, and adds extra layers to detect confusion and frustration. The Imotions API can monitor live video feeds to extract valence, or can aggregate previously recorded videos to analyze for emotions. Imotions software has been used by Harvard, Procter & Gamble, Yale, and the US Air Force, and was even used in a Mythbusters episode.


12: CrowdEmotion

CrowdEmotion offers an API that uses facial recognition to detect the time series of the six universal emotions as defined by psychologist Paul Ekman (happiness, surprise, anger, disgust, fear and sadness). Their online demo will analyze facial points in real-time video, and respond with detailed visualizations. They offer an API sandbox, along with free monthly usage for live testing. Check out the CrowdEmotion API docs for specific information.
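
Because the response is a time series per emotion, a consumer will typically aggregate it, for example computing the average and peak happiness over a clip. A small sketch with an invented data shape:

    # Invented time-series shape: (timestamp_seconds, score) pairs per emotion.
    timeseries = {
        "happiness": [(0.0, 0.10), (0.5, 0.35), (1.0, 0.80), (1.5, 0.75)],
        "surprise":  [(0.0, 0.05), (0.5, 0.40), (1.0, 0.20), (1.5, 0.10)],
    }

    for emotion, points in timeseries.items():
        mean = sum(score for _, score in points) / len(points)
        peak_time, peak = max(points, key=lambda p: p[1])
        print("%s: mean=%.2f, peak=%.2f at t=%ss" % (emotion, mean, peak, peak_time))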


13: FacioMetrics

Founded at Carnegie Mellon University (CMU), FacioMetrics is a company that provides SDKs for incorporating face tracking, pose and gaze tracking, and expression analysis into apps. Their demo video outlines some creative use cases in virtual reality scenarios. The software can be tested using the Intraface iOS app.


Text to Emotion Software

There are many sentiment analysis APIs out there that provide categorization or entity extraction, but the APIs listed below specifically respond with an emotional summary given a body of plain text. Some keywords to understand here are natural language processing, the use of machines to understand "natural" human language, and deep linguistic analysis, the examination of sentence structure and the relationships between keywords to derive sentiment.

You could use these APIs to do things like inform social media engagement analytics, add new features to chat messaging, perform targeted news research, detect highly negative/positive customer experiences, or optimize publishing with A/B testing.

14: IBM Watson

Powered by the IBM Watson supercomputer, the Tone Analyzer detects emotional tones, social propensities, and writing styles from any length of plain text. The API can be forked on GitHub. Input your own selection on the demo to see tone percentile, word count, and a JSON response. The IBM Watson Developer Cloud also powers other cool cognitive computing tools.
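
A sketch of querying the Tone Analyzer from Python; the URL, version date, Basic-auth credential scheme, and response fields follow the v3 documentation of the era and are assumptions rather than a verified, current interface:

    import requests

    # Endpoint and version date from the v3 Tone Analyzer docs of the time;
    # assumptions that may no longer match the live service.
    URL = "https://gateway.watsonplatform.net/tone-analyzer/api/v3/tone"

    resp = requests.post(
        URL,
        params={"version": "2016-05-19"},
        auth=("SERVICE_USERNAME", "SERVICE_PASSWORD"),  # Bluemix service credentials
        json={"text": "I am thrilled with how this project turned out!"},
    )
    for category in resp.json()["document_tone"]["tone_categories"]:
        for tone in category["tones"]:
            print(tone["tone_name"], tone["score"])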



15: Receptiviti

Backed by decades of language-psychology research, the Receptiviti Natural Language Personality Analytics API uses a process of target words and emotive categories to derive emotion and personality from texts. Their Linguistic Inquiry and Word Count (LIWC) text analysis process is even used by IBM Watson. With REST API endpoints and SDKs in all major programming languages, Receptiviti looks both powerful and usable.

16: AlchemyAPI

The Alchemy API scans large chunks of text to determine the relevance of keywords and their associated negative/positive connotations to get a sense of attitude or opinion. You can enter a URL to receive a grade of positive, mixed, or negative overall sentiment. Though it’s more for defining taxonomies and keyword relevance, the tool does offer an overall sentiment evaluation for the document. Check out the demo or Sentiment Analysis API docs.

17: Bitext

The Text Analysis API by Bitext is another deep linguistic analysis tool. It can be used to analyze word relations, sentences, structure, and dependencies to extract bias with its "built-in sentiment scoring" functionality.


18: Mood Patrol

Hosted on the Mashape API marketplace, Mood Patrol by Soul Hackers Labs is a simple API that extracts emotion from text. It's good for analyzing small sections of text for cues, responding with fine-grained adjectives that describe the emotional tone based on Plutchik's 8 basic emotions. Visit the Soul Hackers demo or API documentation.

19: Synesketch

Synesketch is basically the iTunes artwork player for the written word. It’s an innovative open source tool that analyzes text for sentiment, and converts emotional tone into some awesome visualizations. Talk about emotional intelligence — “[Synesketch] code feels the words”, dynamically representing text in animated visual patterns so as to reveal underlying emotion. A few third-party apps have already been constructed with this open source software to recognize and visualize emotion from Tweets, speech, poetry, and more.


20: Tone API

The Tone API is a speedy SaaS API built for marketers to quantify the emotional response to their content. The tool takes a body of text and analyzes it for emotional breadth, intensity, and comparison with other texts. Looks to be a cool service for automating in-house research to optimize smart content publishing.



21: Repustate API

The Repustate Sentiment Analysis process is grounded in linguistic theory, and reviews cues from lemmatization, polarity, negations, part of speech, and more to reach an informed sentiment from a text document. Check out the info on their Text Analytics API.

Speech to Emotion Software

Lastly, humans also interact with machines via speech. There are plenty of speech recognition APIs on the market, whose results could be processed by other sentiment analysis APIs listed above. Perhaps this is why an easy-to-consume web API that instantly recognizes emotion from recorded voice is rare. Use cases for this tech could be:


  • Monitoring customer support centers

  • Providing dispatch squads automated emotional intelligence
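
Chaining the two kinds of API is straightforward in principle: transcribe first, then feed the transcript to any of the text analyzers above. A sketch with hypothetical endpoints (both URLs and the response fields are invented stand-ins):

    import requests

    # Hypothetical speech-to-text endpoint; stands in for any real STT API.
    def transcribe(audio_path):
        with open(audio_path, "rb") as f:
            resp = requests.post("https://api.example-stt.com/v1/recognize",
                                 files={"audio": f})
        return resp.json()["transcript"]

    # Hypothetical text-emotion endpoint; stands in for the APIs above.
    def emotions_of(text):
        resp = requests.post("https://api.example-sentiment.com/v1/analyze",
                             json={"text": text})
        return resp.json()["emotions"]

    print(emotions_of(transcribe("support_call.wav")))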

22: Good Vibrations

The Good Vibrations API senses mood from recorded voice. The API and SDK use universal biological signals to perform a real-time analysis of the user's emotion, sensing stress, pleasure, or disorder.

They’re not really web APIs, but EMOSpeech is an enterprise software application that allows call centers to analyze emotion, and Audeering software detects emotion, tone, and gender in recorded voice.

Conclusion: The Future of Emotion Recognition

Machine emotional intelligence is still evolving, but the future could soon see targeted ads that respond not only to our demographics (age, gender, likes, etc.) but to our current emotional state. For point-of-sale advertising, this information could be leveraged to nudge sales when people are most emotionally vulnerable, getting into some murky ethical territory. Emotion recognition via facial detection is also shady if users aren't aware that they have consented to being recorded. There are, of course, data privacy legalities any API provider or consumer should be aware of before implementation.

We are only at the tip of the iceberg when it comes to machine-human interaction, but cognitive computing technologies like these are exciting steps toward creating true machine emotional intelligence.

Did we leave out any good Emotion Recognition APIs? Respond below or add to this Product Hunt list.


Well, this is, in a way, both a tool and the ideal final result.

metanymous in the Metapractice post (original on LJ)

Do you mean in Dilts's book, "The Structure of Subjective Experience"?

bavi in the Metapractice post (original on LJ)

Good. I've only come across it in Dilts, in connection with eye accessing cues.
