Sunday 30 June 2013

Healthcare Marketing Series - Data Mining - The 21st Century Marketing Gold Rush

There is gold in them there hills! Well, there is gold right within a few blocks of your office. Mining for patients, not unlike mining for gold or drilling for oil, requires either great luck or great research.

It's all about the odds.

It's true that, like old Jed from the Beverly Hillbillies, you might just take a shot and strike oil. But more likely you'll drill a dry hole or dig a mine and find dirt, not diamonds. Without research you might be a mere two feet from pay dirt, yet drilling or mining in just the wrong spot.

Now oil companies and gold mining companies spend millions, if not billions, of dollars studying where and how to effectively find the "mother lode". If market research is good enough for the big boys, it should be good enough for the healthcare provider. Remember, as a healthcare professional you probably don't have extra millions lying around to squander on trial-and-error marketing.

If you did there would be little need for you to market to find new patients to help.

In previous articles in the Health Care Marketing Series we talked about developing a marketing strategy, using metrics to measure the performance of your marketing execution, developing effective marketing warheads based on your marketing strategy, evaluating the most efficient ways to deliver those warheads, your marketing missile systems, and tying several marketing methods together into a marketing MIRV.

If you have been following along with our articles and integrating the concepts detailed in them, by now you should have an excellent marketing infrastructure, ready to launch laser-guided marketing missiles tipped with nuclear marketing MIRVs. The better you have done your research, the more detailed your marketing strategy, and the more effective and efficient your delivery systems, the bigger the bang you'll receive from your marketing campaign. And ultimately, the more lives you will help to change for patients who can truly benefit from your skills and talents as a doctor.

Sounds like you're ready for healthcare marketing shock and awe.

Everything is ready to launch, this is great, press the button and fire away!

Ah, but wait just a minute, General. What is the target? Where are they? What are the aiming coordinates?

The target? Why of course all those sick people out there.

Where are they? Well of course, out there!

The coordinates? Man just press the button, carpet bomb man. Carpet bomb!

This scenario is designed to show you how quickly the wheels can come off even the best-intentioned marketing war machine. It brings us back full circle, right back to our original article on marketing strategy.

But this time we are going to introduce the concept of data mining. If you remember, our article on marketing strategy talked about doing research. We talked about research as the true cornerstone of all marketing efforts.

What is the target, General?

Answering this question is a little difficult. The truth is that each healthcare provider needs to determine his or her high value targets and, more importantly, needs to know how to determine them.

Let's go back to our launch scenario and continue with the military analogy. Say our marketing battlefield is made up of several aircraft carriers, a few destroyers, and a fleet of rowboats.

As we have discussed previously, waging a marketing war, like any war, consumes resources. So do we want to launch our nuclear marketing MIRVs, the most valuable resources in our arsenal, and target the fleet of rowboats?

Or would it be wiser to target those aircraft carriers?

Well the obvious answer is "get those carriers".

But here is where things get a little tricky. One man's aircraft carrier is another man's rowboat.

You have to data mine your practice to determine which targets are high value targets.

What goes into that data mining process? Well, first and foremost, what conditions do you 1. like to treat, 2. have a proven track record of treating, and 3. obtain a reasonable reimbursement for treating?

In my own practice, I typically do not enjoy treating shoulder problems. I don't know if I don't like treating shoulders because I haven't had great results with them, or if I haven't had great results because I don't like treating them. Needless to say, my reimbursement for treating shoulder cases is relatively low.

So do I really want to carpet bomb my marketing terrain and come up with 10 new cases of rotator cuff tears? These cases, for more than one reason, are my rowboats.

On the contrary, I like to treat neurological conditions: chronic pain, neuropathy, spinal stenosis, tinnitus, Parkinson's disease, and multiple sclerosis. I've had results with these types of cases that have been good enough to publish. Because they are complex and difficult cases, I obtain a better than average reimbursement for my efforts. These cases are my aircraft carriers. If my marketing campaign brings me ten cases with these types of problems, chances are that the patients will obtain some great relief, I will find working with them an intellectually stimulating challenge, and my marketing efforts will bring me a handsome return on investment.

So the first lesson of data mining is to identify your aircraft carriers. They must be "your" aircraft carriers. You must have a good personal track record of helping these types of patients. You should enjoy treating these types of cases. And you should be rewarded for your time and expertise.

That's the first step in the process: identifying your high value targets. The next step is THE most important aspect of healthcare marketing. As I discussed above, I enjoy working with complex neurological cases. But how many of these types of patients exist in my marketing terrain, and are they looking for the type of help I can offer?

Being able to accurately answer these important questions is the single most valuable information I can extract using data mining.

It doesn't matter if I like treating these cases. It doesn't matter if I make a good living treating these cases. It doesn't matter if my success in treating these cases has made the local news. What matters is 1. do these types of cases exist in my neighborhood and 2. are they looking for the help I can provide to them?

You absolutely positively need to know who is looking for what in your marketing terrain and if what people are clamoring for is what you have to offer.

This knowledge is the most powerful tool in your marketing arsenal. It's your secret weapon. It is the foundation of your marketing strategy. It is so important that you should consider moving your office if the results of your data mining don't reveal an ocean full of aircraft carriers in your marketing terrain for you to target.

If your market research does not reveal an abundance of aircraft carriers on your horizon, you need to either 1. move to a new battlefield, 2. re-target your efforts towards the destroyers in your market or 3. try to create a market.

Let's look at the last choice: trying to create a market. Unless you are Coke or Pepsi, your ability to create a market as a healthcare provider is extremely limited. To continue with our analogy, creating a market requires converting rowboats into, at the very least, destroyers, but better yet aircraft carriers.

What would it cost if you took a rowboat to a ship yard and told them to rebuild it as an aircraft carrier?

This is what you face if you try to create a market where none exists. Unless you have a personality flaw and thrive on selling ice to Eskimos, creating a market is not a rewarding proposition.

So scratch this option off the table right now.

What about re-targeting your campaign towards destroyers? That's a viable option. It's a good option. It's probably your best option. It's an option that will likely give you your best return on investment. It is recommended that you focus your arsenal on the destroyers while at the same time never passing on an opportunity to sink an aircraft carrier.

So what is the secret? How do you data mine for aircraft carriers?

Well, it's quite simple in the internet age. Just use the services of a market research firm. I like http://www.marketresearch.com. They will do the data mining for you.

They can provide market intelligence that will tell you not only what the health care aircraft carriers are, but also where they are.

With this information, you will have a competitive advantage on your marketing battlefield. You can segment and target high value targets in your area while your competitors squander their marketing resources on rowboats, or, even worse, carpet bomb and hit open ocean rather than valuable targets.

Your marketing strategy should be highly targeted, and your marketing resources should be well spent. As we discussed in our very first article on true "Marketing Strategy", you should enter the battle against your competition already knowing you have won.

What gives you this dominant position in the market is knowing, ahead of time, who is looking for what in your marketing terrain. In other words, not trying to create a market, but rather identifying existing market niches, targeting them with laser-guided precision, and writing headlines and ad copy based on your strengths versus the weaknesses of your competition within each niche.

This research-based marketing strategy is sure to cause a big bang with potential patients.


Source: http://ezinearticles.com/?Healthcare-Marketing-Series---Data-Mining---The--21st-Century-Marketing-Gold-Rush&id=1486283

Friday 28 June 2013

Three Common Methods For Web Data Extraction

Probably the most common technique traditionally used to extract data from web pages is to cook up some regular expressions that match the pieces you want (e.g., URLs and link titles). Our screen-scraper software actually started out as an application written in Perl for this very reason. In addition to regular expressions, you might also use some code written in something like Java or Active Server Pages to parse out larger chunks of text. Using raw regular expressions to pull out the data can be a little intimidating to the uninitiated, and can get a bit messy when a script contains a lot of them. At the same time, if you're already familiar with regular expressions, and your scraping project is relatively small, they can be a great solution.
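To make this concrete, here is a minimal Python sketch of the regex approach applied to the URL-and-link-title example above. The HTML snippet and the pattern are illustrative only; real pages are messier, which is exactly the fragility discussed under the disadvantages below.

```python
import re

html = """
<a href="https://example.com/news">Latest News</a>
<a href='https://example.com/sports'>Sports Scores</a>
"""

# A deliberately simple pattern: it tolerates single or double quotes
# around the href value, but will miss anchors with extra attributes.
link_pattern = re.compile(
    r'<a\s+href=["\']([^"\']+)["\']\s*>([^<]+)</a>',
    re.IGNORECASE,
)

for url, title in link_pattern.findall(html):
    print(url, "->", title)
```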

Other techniques for getting the data out can get very sophisticated as algorithms that make use of artificial intelligence and such are applied to the page. Some programs will actually analyze the semantic content of an HTML page, then intelligently pull out the pieces that are of interest. Still other approaches deal with developing "ontologies", or hierarchical vocabularies intended to represent the content domain.

There are a number of companies (including our own) that offer commercial applications specifically intended to do screen-scraping. The applications vary quite a bit, but for medium to large-sized projects they're often a good solution. Each one will have its own learning curve, so you should plan on taking time to learn the ins and outs of a new application. Especially if you plan on doing a fair amount of screen-scraping it's probably a good idea to at least shop around for a screen-scraping application, as it will likely save you time and money in the long run.

So what's the best approach to data extraction? It really depends on what your needs are, and what resources you have at your disposal. Here are some of the pros and cons of the various approaches, as well as suggestions on when you might use each one:

Raw regular expressions and code

Advantages:

- If you're already familiar with regular expressions and at least one programming language, this can be a quick solution.

- Regular expressions allow for a fair amount of "fuzziness" in the matching such that minor changes to the content won't break them.

- You likely don't need to learn any new languages or tools (again, assuming you're already familiar with regular expressions and a programming language).

- Regular expressions are supported in almost all modern programming languages. Heck, even VBScript has a regular expression engine. It's also nice because the various regular expression implementations don't vary too significantly in their syntax.

Disadvantages:

- They can be complex for those who don't have a lot of experience with them. Learning regular expressions isn't like going from Perl to Java. It's more like going from Perl to XSLT, where you have to wrap your mind around a completely different way of viewing the problem.

- They're often confusing to analyze. Take a look through some of the regular expressions people have created to match something as simple as an email address and you'll see what I mean.

- If the content you're trying to match changes (e.g., they change the web page by adding a new "font" tag) you'll likely need to update your regular expressions to account for the change.

- The data discovery portion of the process (traversing various web pages to get to the page containing the data you want) will still need to be handled, and can get fairly complex if you need to deal with cookies and such.

When to use this approach: You'll most likely use straight regular expressions in screen-scraping when you have a small job you want to get done quickly. Especially if you already know regular expressions, there's no sense in getting into other tools if all you need to do is pull some news headlines off of a site.

Ontologies and artificial intelligence

Advantages:

- You create it once and it can more or less extract the data from any page within the content domain you're targeting.

- The data model is generally built in. For example, if you're extracting data about cars from web sites the extraction engine already knows what the make, model, and price are, so it can easily map them to existing data structures (e.g., insert the data into the correct locations in your database).

- There is relatively little long-term maintenance required. As web sites change you likely will need to do very little to your extraction engine in order to account for the changes.

Disadvantages:

- It's relatively complex to create and work with such an engine. The level of expertise required to even understand an extraction engine that uses artificial intelligence and ontologies is much higher than what is required to deal with regular expressions.

- These types of engines are expensive to build. There are commercial offerings that will give you the basis for doing this type of data extraction, but you still need to configure them to work with the specific content domain you're targeting.

- You still have to deal with the data discovery portion of the process, which may not fit as well with this approach (meaning you may have to create an entirely separate engine to handle data discovery). Data discovery is the process of crawling web sites such that you arrive at the pages where you want to extract data.

When to use this approach: Typically you'll only get into ontologies and artificial intelligence when you're planning on extracting information from a very large number of sources. It also makes sense to do this when the data you're trying to extract is in a very unstructured format (e.g., newspaper classified ads). In cases where the data is very structured (meaning there are clear labels identifying the various data fields), it may make more sense to go with regular expressions or a screen-scraping application.

Screen-scraping software

Advantages:

- Abstracts most of the complicated stuff away. You can do some pretty sophisticated things in most screen-scraping applications without knowing anything about regular expressions, HTTP, or cookies.

- Dramatically reduces the amount of time required to set up a site to be scraped. Once you learn a particular screen-scraping application the amount of time it requires to scrape sites vs. other methods is significantly lowered.

- Support from a commercial company. If you run into trouble while using a commercial screen-scraping application, chances are there are support forums and help lines where you can get assistance.

Disadvantages:

- The learning curve. Each screen-scraping application has its own way of going about things. This may imply learning a new scripting language in addition to familiarizing yourself with how the core application works.

- A potential cost. Most ready-to-go screen-scraping applications are commercial, so you'll likely be paying in dollars as well as time for this solution.

- A proprietary approach. Any time you use a proprietary application to solve a computing problem (and proprietary is obviously a matter of degree) you're locking yourself into using that approach. This may or may not be a big deal, but you should at least consider how well the application you're using will integrate with other software applications you currently have. For example, once the screen-scraping application has extracted the data how easy is it for you to get to that data from your own code?

When to use this approach: Screen-scraping applications vary widely in their ease-of-use, price, and suitability to tackle a broad range of scenarios. Chances are, though, that if you don't mind paying a bit, you can save yourself a significant amount of time by using one. If you're doing a quick scrape of a single page you can use just about any language with regular expressions. If you want to extract data from hundreds of web sites that are all formatted differently you're probably better off investing in a complex system that uses ontologies and/or artificial intelligence. For just about everything else, though, you may want to consider investing in an application specifically designed for screen-scraping.

As an aside, I thought I should also mention a recent project we've been involved with that has actually required a hybrid approach of two of the aforementioned methods. We're currently working on a project that deals with extracting newspaper classified ads. The data in classifieds is about as unstructured as you can get. For example, in a real estate ad the term "number of bedrooms" can be written about 25 different ways. The data extraction portion of the process is one that lends itself well to an ontologies-based approach, which is what we've done. However, we still had to handle the data discovery portion. We decided to use screen-scraper for that, and it's handling it just great. The basic process is that screen-scraper traverses the various pages of the site, pulling out raw chunks of data that constitute the classified ads. These ads then get passed to code we've written that uses ontologies in order to extract out the individual pieces we're after. Once the data has been extracted we then insert it into a database.
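To illustrate the extraction half of that hybrid, here is a hedged Python sketch of a tiny stand-in for an ontology: several surface forms of "number of bedrooms" all map to one canonical value. The patterns and word list are hypothetical and far smaller than what a real ontology-based engine would carry.

```python
import re

# Hypothetical mini-"ontology": many surface forms map to one canonical field.
BEDROOM_PATTERNS = [
    r'(\d+)\s*(?:br|bdrm|bed(?:room)?s?)\b',
    r'\b(one|two|three|four|five)\s+bed(?:room)?s?\b',
]
WORD_TO_NUM = {"one": 1, "two": 2, "three": 3, "four": 4, "five": 5}

def extract_bedrooms(ad_text):
    """Map the many ways 'number of bedrooms' is written to one integer."""
    text = ad_text.lower()
    for pattern in BEDROOM_PATTERNS:
        match = re.search(pattern, text)
        if match:
            value = match.group(1)
            return WORD_TO_NUM.get(value) or int(value)
    return None  # field not found; a real engine would try more variants

# The discovery stage (screen-scraper, in our project) supplies raw ad chunks:
ads = ["Sunny 3BR apt near park", "Charming house, three bedrooms, 2 baths"]
for ad in ads:
    print(extract_bedrooms(ad), "-", ad)
```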


Source: http://ezinearticles.com/?Three-Common-Methods-For-Web-Data-Extraction&id=165416

Wednesday 26 June 2013

Effective Online Data Entry Services

The outsourcing market has many enthusiastic buyers who have paid a small amount to online data entry service providers and feel they paid very little relative to the work they got done. Online data entry services are also helpful to a number of smaller business units that take on these projects as a significant source of work.

Online data entry services include data typing, product entry, web and mortgage research, data mining, and extraction services. Service providers assign a proficient workforce to your project who deliver the best possible results on time. They use up-to-date technology, which helps guarantee data security.

A few obvious benefits of outsourcing online data entry:

    Business units receive quality online data entry services from project owners.
    Entering data is the first step through which companies gain the understanding of their work that drives strategic decisions. Raw data, represented by mere numbers, soon becomes a decision-making factor that accelerates the progress of the business.
    The systems used by these services are fully protected to maintain a high level of security.
    As you obtain higher-quality information, the company's business executives can arrive at better decisions that drive progress in the company.
    Shortened turnaround time.
    Cutting down on cost by saving on operational overheads.

Companies are highly attracted to the benefits of outsourcing their projects for these services, as it saves time as well as money.

Flourishing companies want to concentrate on their key business activities instead of venturing into such non-core activities. They take the wise step of outsourcing their work to data entry services and keep themselves free for their core business functions.


Source: http://ezinearticles.com/?Effective-Online-Data-Entry-Services&id=5681261

Monday 24 June 2013

What's Your Excuse For Not Using Data Mining?

In an earlier article I briefly described how data mining and RFM analysis can help marketers be more efficient (read... increased marketing ROI!). These marketing analytics tools can significantly help with all direct marketing efforts (multichannel campaign management efforts using direct mail, email and call center) and some interactive marketing efforts as well. So, why aren't all companies using it today? Well, typically it comes down to a lack of data and/or statistical expertise. Even if you don't have data mining expertise, YOU can benefit from data mining by using a consultant. With that in mind, let's tackle the first problem -- collecting and developing the data that is useful for data mining.

The most important data to collect for data mining include:

- Transaction data - For every sale, you at least need to know the product and the amount and date of the purchase.

- Past campaign response data - For every campaign you've run, you need to identify who responded and who didn't. You may need to use direct and indirect response attribution.

- Geo-demographic data - This is optional, but you may want to append your customer file/database with consumer overlay data from companies like Acxiom.

- Lifestyle data - This is also an optional append of indicators of socio-economic lifestyle that are developed by companies like Claritas.

All of the above data may or may not exist in the same data source. Some companies have a single holistic view of the customer in a database and some don't. If you don't, you'll have to make sure all data sources that contain customer data have the same customer ID/key. That way, all of the needed data can be brought together for data mining.

How much data do you need for data mining? You'll hear many different answers, but I like to have at least 15,000 customer records to have confidence in my results.

Once you have the data, you need to massage it to get it ready to be "baked" by your data mining application. Some data mining applications will automatically do this for you. It's like a bread machine where you put in all the ingredients -- they automatically get mixed, the bread rises, bakes, and is ready for consumption! Some notable companies that do this include KXEN, SAS, and SPSS. Even if you take the automated approach, it's helpful to understand what kinds of things are done to the data prior to model building.

Preparation includes:

- Missing data analysis. What fields have missing values? Should you fill in the missing values? If so, what values do you use? Should the field be used at all?

- Outlier detection. Is "33 children in a household" extreme? Probably - and consequently this value should be adjusted to perhaps the average or maximum number of children in your customer's households.

- Transformations and standardizations. When various fields have vastly different ranges (e.g., number of children per household and income), it's often helpful to standardize or normalize your data to get better results. It's also useful to transform data to get better predictive relationships. For instance, it's common to transform monetary variables by using their natural logs.

- Binning data. Binning continuous variables is an approach that can help with noisy data. It is also required by some data mining algorithms.
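For readers who want to see what this massaging looks like in practice, here is a minimal Python/pandas sketch of the four preparation steps above. The column names and the fill and capping choices are illustrative assumptions, not prescriptions.

```python
import numpy as np
import pandas as pd

# Hypothetical customer extract; column names are illustrative only.
df = pd.DataFrame({
    "children": [1, 2, 33, None, 0],          # 33 is the outlier from the text
    "income":   [42000, None, 58000, 71000, 39000],
    "monetary": [120.0, 15.5, 980.0, 55.0, 310.0],
})

# Missing data: fill numeric gaps with the median (one common choice).
df["children"] = df["children"].fillna(df["children"].median())
df["income"] = df["income"].fillna(df["income"].median())

# Outlier detection: cap extreme 'children' values at the 95th percentile.
df["children"] = df["children"].clip(upper=df["children"].quantile(0.95))

# Transformation: natural log of a monetary variable, as suggested above.
df["log_monetary"] = np.log1p(df["monetary"])

# Standardization: z-score income so it is comparable to other fields.
df["income_z"] = (df["income"] - df["income"].mean()) / df["income"].std()

# Binning: turn continuous income into three labeled bands.
df["income_band"] = pd.cut(df["income"], bins=3, labels=["low", "mid", "high"])
print(df)
```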

More to come on data mining for marketers in my next article.


Source: http://ezinearticles.com/?Whats-Your-Excuse-For-Not-Using-Data-Mining?&id=3576029

Friday 21 June 2013

Collecting Data With Web Scrapers

There is a large amount of data available only through websites. However, as many people have found out, trying to copy data into a usable database or spreadsheet directly out of a website can be a tiring process. Data entry from internet sources can quickly become cost prohibitive as the required hours add up. Clearly, an automated method for collating information from HTML-based sites can offer huge management cost savings.

Web scrapers are programs that are able to aggregate information from the internet. They are capable of navigating the web, assessing the contents of a site, and then pulling data points and placing them into a structured, working database or spreadsheet. Many companies and services use web scraping programs for tasks such as comparing prices, performing online research, or tracking changes to online content.

Let's take a look at how web scrapers can aid data collection and management for a variety of purposes.

Improving On Manual Entry Methods

Using a computer's copy and paste function or simply typing text from a site is extremely inefficient and costly. Web scrapers are able to navigate through a series of websites, make decisions on what is important data, and then copy the info into a structured database, spreadsheet, or other program. Software packages include the ability to record macros by having a user perform a routine once and then having the computer remember and automate those actions. Every user can effectively act as their own programmer to expand the capabilities to process websites. These applications can also interface with databases in order to automatically manage information as it is pulled from a website.
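As a rough illustration of that navigate-assess-copy loop, here is a short Python sketch using the requests and BeautifulSoup libraries to pull rows from a hypothetical retailer listing into a CSV spreadsheet. The URL and table layout are assumptions for the example.

```python
import csv
import requests
from bs4 import BeautifulSoup

# Hypothetical target: a page listing retailers in an HTML table.
URL = "https://example.com/retailers"

response = requests.get(URL, timeout=10)
response.raise_for_status()
soup = BeautifulSoup(response.text, "html.parser")

rows = []
for tr in soup.select("table tr")[1:]:   # skip the header row
    cells = [td.get_text(strip=True) for td in tr.find_all("td")]
    if len(cells) >= 2:
        rows.append({"name": cells[0], "phone": cells[1]})

# Place the extracted data points into a structured spreadsheet file.
with open("retailers.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "phone"])
    writer.writeheader()
    writer.writerows(rows)
```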

Aggregating Information

There are a number of instances where material stored in websites can be aggregated and repurposed. For example, a clothing company that is looking to bring their line of apparel to retailers can go online for the contact information of retailers in their area and then present that information to sales personnel to generate leads. Many businesses can perform market research on prices and product availability by analyzing online catalogues.

Data Management

Managing figures and numbers is best done through spreadsheets and databases; however, information on a website formatted with HTML is not readily accessible for such purposes. While websites are excellent for displaying facts and figures, they fall short when they need to be analyzed, sorted, or otherwise manipulated. Ultimately, web scrapers are able to take the output that is intended for display to a person and change it to numbers that can be used by a computer. Furthermore, by automating this process with software applications and macros, entry costs are severely reduced.

This type of data management is also effective at merging different information sources. If a company were to purchase research or statistical information, it could be scraped in order to format the information into a database. This is also highly effective at taking a legacy system's contents and incorporating them into today's systems.


Source: http://ezinearticles.com/?Collecting-Data-With-Web-Scrapers&id=4223877

Thursday 20 June 2013

Assuring Scraping Success with Proxy Data Scraping


Have you ever heard of "Data Scraping?" Data scraping is the process of collecting useful data that has been placed in the public domain of the internet (private areas too, if conditions are met) and storing it in databases or spreadsheets for later use in various applications. Data scraping technology is not new, and many a successful businessman has made his fortune by taking advantage of it.

Sometimes website owners may not derive much pleasure from automated harvesting of their data. Webmasters have learned to disallow web scrapers access to their websites by using tools or methods that block certain IP addresses from retrieving website content. Data scrapers are left with the choice to either target a different website, or to move the harvesting script from computer to computer, using a different IP address each time, and extract as much data as possible until all of the scraper's computers are eventually blocked.

Thankfully there is a modern solution to this problem. Proxy Data Scraping technology solves the problem by using proxy IP addresses. Every time your data scraping program executes an extraction from a website, the website thinks it is coming from a different IP address. To the website owner, proxy data scraping simply looks like a short period of increased traffic from all around the world. They have very limited and tedious ways of blocking such a script but more importantly -- most of the time, they simply won't know they are being scraped.
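A bare-bones version of that idea can be sketched in Python with the requests library: cycle through a pool of proxy addresses so each fetch appears to come from a different IP. The addresses below come from a reserved documentation range and are placeholders, not working proxies.

```python
import itertools
import requests

# Hypothetical proxy pool; in practice these come from a rotating service.
PROXIES = [
    "http://203.0.113.10:8080",
    "http://203.0.113.11:8080",
    "http://203.0.113.12:8080",
]
proxy_cycle = itertools.cycle(PROXIES)

def fetch(url):
    """Fetch a page, presenting a different proxy IP on each request."""
    proxy = next(proxy_cycle)
    response = requests.get(
        url,
        proxies={"http": proxy, "https": proxy},
        timeout=10,
    )
    response.raise_for_status()
    return response.text

pages = [f"https://example.com/listings?page={n}" for n in range(1, 4)]
for page_url in pages:
    html = fetch(page_url)
    print(page_url, len(html), "bytes")
```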

You may now be asking yourself, "Where can I get Proxy Data Scraping Technology for my project?" The "do-it-yourself" solution is, rather unfortunately, not simple at all. Setting up a proxy data scraping network takes a lot of time and requires that you own a bunch of IP addresses and suitable servers to be used as proxies, not to mention the IT guru needed to get everything configured properly. You could consider renting proxy servers from select hosting providers, but that option tends to be quite pricey, though arguably better than the alternative: dangerous and unreliable (but free) public proxy servers.

There are literally thousands of free proxy servers located around the globe that are simple enough to use. The trick however is finding them. Many sites list hundreds of servers, but locating one that is working, open, and supports the type of protocols you need can be a lesson in persistence, trial, and error. However if you do succeed in discovering a pool of working public proxies, there are still inherent dangers of using them. First off, you don't know who the server belongs to or what activities are going on elsewhere on the server. Sending sensitive requests or data through a public proxy is a bad idea. It is fairly easy for a proxy server to capture any information you send through it or that it sends back to you. If you choose the public proxy method, make sure you never send any transaction through that might compromise you or anyone else in case disreputable people are made aware of the data.

A less risky scenario for proxy data scraping is to rent a rotating proxy connection that cycles through a large number of private IP addresses. There are several such companies available that claim to delete all web traffic logs, which allows you to anonymously harvest the web with minimal threat of reprisal. Companies such as http://www.Anonymizer.com offer large scale anonymous proxy solutions, but often carry a fairly hefty setup fee to get you going.

The other advantage is that companies who own such networks can often help you design and implement a custom proxy data scraping program instead of trying to work with a generic scraping bot. After performing a simple Google search, I quickly found one company (www.ScrapeGoat.com) that provides anonymous proxy server access for data scraping purposes. Or, according to their website, if you want to make your life even easier, ScrapeGoat can extract the data for you and deliver it in a variety of different formats, often before you could even finish configuring your off-the-shelf data scraping program.

Whichever path you choose for your proxy data scraping needs, don't let a few simple tricks thwart you from accessing all the wonderful information stored on the world wide web!


Source: http://ezinearticles.com/?Assuring-Scraping-Success-with-Proxy-Data-Scraping&id=248993

Tuesday 18 June 2013

Digging Up Dollars With Data Mining - An Executive's Guide


Introduction

Traditionally, organizations use data tactically - to manage operations. For a competitive edge, strong organizations use data strategically - to expand the business, to improve profitability, to reduce costs, and to market more effectively. Data mining (DM) creates information assets that an organization can leverage to achieve these strategic objectives.

In this article, we address some of the key questions executives have about data mining. These include:

    What is data mining?
    What can it do for my organization?
    How can my organization get started?

Business Definition of Data Mining

Data mining is a new component in an enterprise's decision support system (DSS) architecture. It complements and interlocks with other DSS capabilities such as query and reporting, on-line analytical processing (OLAP), data visualization, and traditional statistical analysis. These other DSS technologies are generally retrospective. They provide reports, tables, and graphs of what happened in the past. A user who knows what she's looking for can answer specific questions like: "How many new accounts were opened in the Midwest region last quarter," "Which stores had the largest change in revenues compared to the same month last year," or "Did we meet our goal of a ten-percent increase in holiday sales?"

We define data mining as "the data-driven discovery and modeling of hidden patterns in large volumes of data." Data mining differs from the retrospective technologies above because it produces models - models that capture and represent the hidden patterns in the data. With it, a user can discover patterns and build models automatically, without knowing exactly what she's looking for. The models are both descriptive and prospective. They address why things happened and what is likely to happen next. A user can pose "what-if" questions to a data-mining model that can not be queried directly from the database or warehouse. Examples include: "What is the expected lifetime value of every customer account," "Which customers are likely to open a money market account," or "Will this customer cancel our service if we introduce fees?"

The information technologies associated with DM are neural networks, genetic algorithms, fuzzy logic, and rule induction. It is outside the scope of this article to elaborate on all of these technologies. Instead, we will focus on business needs and how data mining solutions for these needs can translate into dollars.

Mapping Business Needs to Solutions and Profits

What can data mining do for your organization? In the introduction, we described several strategic opportunities for an organization to use data for advantage: business expansion, profitability, cost reduction, and sales and marketing. Let's consider these opportunities very concretely through several examples where companies successfully applied DM.

Expanding your business: Keystone Financial of Williamsport, PA, wanted to expand their customer base and attract new accounts through a LoanCheck offer. To initiate a loan, a recipient just had to go to a Keystone branch and cash the LoanCheck. Keystone introduced the $5000 LoanCheck by mailing a promotion to existing customers.

The Keystone database tracks about 300 characteristics for each customer. These characteristics include whether the person had already opened loans in the past two years, the number of active credit cards, the balance levels on those cards, and finally whether or not they responded to the $5000 LoanCheck offer. Keystone used data mining to sift through the 300 customer characteristics, find the most significant ones, and build a model of response to the LoanCheck offer. Then, they applied the model to a list of 400,000 prospects obtained from a credit bureau.

By selectively mailing to the best-rated prospects determined by the DM model, Keystone generated $1.6M in additional net income from 12,000 new customers.
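The article doesn't say which tool Keystone used, but the general recipe - fit a response model on customers whose outcome is known, then score the prospect list - can be sketched in a few lines of Python with scikit-learn. The file names and feature columns here are hypothetical.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical stand-in for the customer file: a few of the ~300
# characteristics mentioned above, plus the known LoanCheck response.
customers = pd.read_csv("customers.csv")       # assumed columns below
features = ["num_credit_cards", "card_balance", "recent_loans"]
X, y = customers[features], customers["responded"]

# Hold out part of the data to check the model before trusting it.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))

# Score the credit-bureau prospect list and mail only the best-rated names.
prospects = pd.read_csv("prospects.csv")
prospects["response_score"] = model.predict_proba(prospects[features])[:, 1]
best = prospects.sort_values("response_score", ascending=False).head(12000)
```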

Reducing costs: Empire Blue Cross/Blue Shield is New York State's largest health insurer. To compete with other healthcare companies, Empire must provide quality service and minimize costs. Attacking costs in the form of fraud and abuse is a cornerstone of Empire's strategy, and it requires considerable investigative skill as well as sophisticated information technology.

The latter includes a data mining application that profiles each physician in the Empire network based on patient claim records in their database. From the profile, the application detects subtle deviations in physician behavior relative to her/his peer group. These deviations are reported to fraud investigators as a "suspicion index." A physician who performs a high number of procedures per visit, charges 40% more per patient, or sees many patients on the weekend would be flagged immediately from the suspicion index score.

What has this DM effort returned to Empire? In the first three years, they realized fraud-and-abuse savings of $29M, $36M, and $39M respectively.

Improving sales effectiveness and profitability: Pharmaceutical sales representatives have a broad assortment of tools for promoting products to physicians. These tools include clinical literature, product samples, dinner meetings, teleconferences, golf outings, and more. Knowing which promotions will be most effective with which doctors is extremely valuable since wrong decisions can cost the company hundreds of dollars for the sales call and even more in lost revenue.

The reps for a large pharmaceutical company collectively make tens of thousands of sales calls. One drug maker linked six months of promotional activity with corresponding sales figures in a database, which they then used to build a predictive model for each doctor. The data-mining models revealed, for instance, that among six different promotional alternatives, only two had a significant impact on the prescribing behavior of physicians. Using all the knowledge embedded in the data-mining models, the promotional mix for each doctor was customized to maximize ROI.

Although this new program was rolled out just recently, early responses indicate that the drug maker will exceed the $1.4M sales increase originally projected. Given that this increase is generated with no new promotional spending, profits are expected to increase by a similar amount.

Looking back at this set of examples, we must ask, "Why was data mining necessary?" For Keystone, response to the loan offer did not exist in the new credit bureau database of 400,000 potential customers. The model predicted the response given the other available customer characteristics. For Empire, the suspicion index quantified the differences between physician practices and peer (model) behavior. Appropriate physician behavior was a multi-variable aggregate produced by data mining - once again, not available in the database. For the drug maker, the promotion and sales databases contained the historical record of activity. An automated data mining method was necessary to model each doctor and determine the best combination of promotions to increase future sales.

Getting Started

In each case presented above, data mining yielded significant benefits to the business. Some were top-line results that increased revenues or expanded the customer base. Others were bottom-line improvements resulting from cost-savings and enhanced productivity. The natural next question is, "How can my organization get started and begin to realize the competitive advantages of DM?"

In our experience, pilot projects are the most successful vehicles for introducing data mining. A pilot project is a short, well-planned effort to bring DM into an organization. Good pilot projects focus on one very specific business need, and they involve business users up front and throughout the project. The duration of a typical pilot project is one to three months, and it generally requires 4 to 10 people part-time.

The role of the executive in such pilot projects is two-pronged. At the outset, the executive participates in setting the strategic goals and objectives for the project. During the project and prior to roll out, the executive takes part by supervising the measurement and evaluation of results. Lack of executive sponsorship and failure to involve business users are two primary reasons DM initiatives stall or fall short.

In reading this article, perhaps you've developed a vision and want to proceed - to address a pressing business problem by sponsoring a data mining pilot project. Twisting the old adage, we say "just because you should doesn't mean you can." Be aware that a capability assessment needs to be an integral component of a DM pilot project. The assessment takes a critical look at data and data access, personnel and their skills, equipment, and software. Organizations typically underestimate the impact of data mining (and information technology in general) on their people, their processes, and their corporate culture. The pilot project provides a relatively high-reward, low-cost, and low-risk opportunity to quantify the potential impact of DM.

Another stumbling block for an organization is deciding to defer any data mining activity until a data warehouse is built. Our experience indicates that, oftentimes, DM could and should come first. The purpose of the data warehouse is to provide users the opportunity to study customer and market behavior both retrospectively and prospectively. A data mining pilot project can provide important insight into the fields and aggregates that need to be designed into the warehouse to make it really valuable. Further, the cost savings or revenue generation provided by DM can provide bootstrap funding for a data warehouse or related initiatives.

Recapping, in this article we addressed the key questions executives have about data mining - what it is, what the benefits are, and how to get started. Armed with this knowledge, begin with a pilot project. From there, you can continue building the data mining capability in your organization to expand your business, improve profitability, reduce costs, and market your products more effectively.


Source: http://ezinearticles.com/?Digging-Up-Dollars-With-Data-Mining---An-Executives-Guide&id=6052872

Friday 14 June 2013

Advantages of Data Mining in Various Businesses

Data mining techniques have advantages for several types of businesses, and more are likely to be discovered over time. Since the era of the computer began, things have been changing pretty quickly, and every new step in technology is equivalent to a revolution. Communication by itself has not been enough: data analysts in the past never had the chance to go further with the data they had in hand. Today, this data isn't used just for selling more of a product but to foresee future risks and prevent them.

Everyone is benefiting from these modern techniques, from smaller to large enterprises. They can now predict the outcome of a particular marketing campaign by analyzing the data. However, in order for these techniques to be successful, the data must be arranged accurately. If your data is scattered, you need to bring it together in one place and then feed it into the systems for the algorithms to figure it out. To put it shortly, no matter how small or big your business might be, you always need the right system for collecting data from your customers, transactions, and all business activities.

Advantages of Data Mining For Businesses

Businesses can truly benefit from the latest techniques; however, in the future, data mining techniques are expected to be even more concise and effective than they are today. Here are the essential applications that you need to understand:

· Big companies providing free web-based email services can use data mining techniques to catch spam emails in their customers' inboxes. Their software uses a technique to assess whether an email is spam or not. These techniques are first tested and validated before they are finally used, to ensure they produce correct results (a minimal sketch of this idea appears after this list).

· Large retail stores and even shopping malls can make use of these techniques by registering and recording the transactions made by their customers. When customers buy particular sets of products together, that gives the store a good understanding of how to place these items in the aisles. If the store wants to change the order and placement of items on weekends, that, too, can be found out by analyzing the data in their database.

· Companies manufacturing edible or drinkable products can easily use data mining techniques to increase their sales in a particular area and launch new products based on the information they've obtained. Conventional statistical analysis is rigid in scenarios where consumer behavior is in question, but data mining techniques still manage to give you good analysis in such situations.

· In call centers, human interaction is at its peak because people are talking with other people at all times. Customers respond differently when they talk to a female representative as opposed to a male representative. The response of customers to an infomercial is different from their response to an ad in the newspaper. This data can be used for the benefit of the business and is best understood with the use of data mining techniques.

· Data mining techniques are also being used in sports today for analyzing the performance of players on the field. Any game can be analyzed with the help of these techniques, and even the behavior of players can be changed on the field through this analysis.
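As a toy illustration of the first bullet (spam filtering), here is a minimal Python sketch using scikit-learn's Naive Bayes classifier. The four training emails are a stand-in for the millions of labeled messages a real mail provider would use.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Tiny hypothetical training set: 1 = spam, 0 = not spam.
emails = [
    "win a free prize now", "cheap meds limited offer",
    "meeting agenda for monday", "your invoice is attached",
]
labels = [1, 1, 0, 0]

# Turn each email into word counts the classifier can work with.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(emails)

# Fit the classifier; a real service would first test and validate it
# on held-out mail, as the bullet above recommends.
classifier = MultinomialNB().fit(X, labels)

new_mail = vectorizer.transform(["claim your free prize"])
print("spam probability:", classifier.predict_proba(new_mail)[0][1])
```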

In short, data mining techniques are giving organizations, enterprises, and smaller businesses the power to focus on their most productive areas. These techniques also allow stores and companies to innovate their current selling techniques by unveiling the hidden trends in their customers' behavior and background, the price of products, placement, closeness to related products, and much more.



Source: http://ezinearticles.com/?Advantages-of-Data-Mining-in-Various-Businesses&id=7568546

Thursday 13 June 2013

Web Data Extraction Services and Data Collection From Website Pages

For any business, market research and surveys play a crucial role in strategic decision making. Web scraping and data extraction techniques help you find relevant information and data for your business or personal use. Much of the time, professionals manually copy-paste data from web pages or download a whole website, which is a waste of time and effort.

Instead, consider using web scraping techniques that crawl through thousands of website pages to extract specific information and simultaneously save this information into a database, CSV file, XML file, or any other custom format for future reference.

Examples of web data extraction process include:
• Spider a government portal, extracting names of citizens for a survey
• Crawl competitor websites for product pricing and feature data
• Use web scraping to download images from a stock photography site for website design

Automated Data Collection
Web scraping also allows you to monitor website data changes over a stipulated period and collect this data on a scheduled basis automatically. Automated data collection helps you discover market trends, determine user behavior, and predict how data will change in the near future; a minimal sketch of such a scheduled job follows the examples below.

Examples of automated data collection include:
• Monitor price information for select stocks on hourly basis
• Collect mortgage rates from various financial firms on daily basis
• Check weather reports on a constant basis as and when required
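As mentioned above, here is a minimal Python sketch of such a scheduled collection job, using the requests and BeautifulSoup libraries to log a mortgage-rate figure once a day. The URL and CSS selector are assumptions; a production job would typically run under cron or a task scheduler rather than a sleep loop.

```python
import csv
import time
from datetime import datetime

import requests
from bs4 import BeautifulSoup

# Hypothetical mortgage-rate page and selector; adjust for the real site.
URL = "https://example.com/mortgage-rates"

def collect_rate():
    """Scrape one rate and append it, timestamped, to a CSV log."""
    soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")
    rate = soup.select_one(".headline-rate").get_text(strip=True)
    with open("rates.csv", "a", newline="") as f:
        csv.writer(f).writerow([datetime.now().isoformat(), rate])

# Collect once a day.
while True:
    collect_rate()
    time.sleep(24 * 60 * 60)
```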

Using web data extraction services you can mine any data related to your business objective and download it into a spreadsheet so that it can be analyzed and compared with ease.

In this way you get accurate and quicker results saving hundreds of man-hours and money!

With web data extraction services you can easily fetch product pricing information, sales leads, mailing databases, competitor data, profile data, and much more on a consistent basis.

Should you have any queries regarding Web Data extraction services, please feel free to contact us. We would strive to answer each of your queries in detail. Email us at info@outsourcingwebresearch.com


Source: http://ezinearticles.com/?Web-Data-Extraction-Services-and-Data-Collection-Form-Website-Pages&id=4860417

Tuesday 11 June 2013

Web Screen Scraping, Screen Scraping, Website Scraping, Web Page Scraping, Web Scraper

Web scraping is a technique used for extracting information from different websites. It makes use of software programs that simulate a human surfing the internet to gather information. A human user would enter the URL, request the web page, copy the information, and paste it. Similarly, the programs or scripts are written in such a way that the software establishes a connection with the server and requests the web page; the server then sends an acknowledgement and the pages requested. The scripts then capture the data and store it as structured data.

Web scraping is implemented using HTTP protocol or by embedding web browsers. The aim of web scraping is to capture unstructured data from the target websites and convert them into structured data which can be stored and maintained in database for any future use. With the growing usage of internet for daily activities like weather monitoring, information gathering, price comparison etc, web scraping has become a great necessity.

Most of the data present in websites is in HTML format, which is machine readable. The process of extracting data from HTML web pages is called web screen scraping. Screen scraping uses software programs or scripts written to read the data from a terminal port or the screen rather than the database. This enables the extraction of data in human readable format.

Website scraping enables extracting information from various websites where it is stored, reaching out even to the hidden pages. Web page scraping involves collecting information from target websites and saving the data in a new database to enable easy filtering and sorting of the data. Web scrapers are designed in such a way that they gather the required information, convert the unstructured data into a structured format, and save the data by assembling it in a proper way for future usage. The output data can be saved in any database, spreadsheet, text file, or any other required format.

The major advantages of using web scraping tools are accuracy and efficiency. The manual work of searching for information, gathering the data, and copying and pasting takes a lot of time, making the job boring and tiresome. Web scrapers complete the task in far less time, making the whole process easier. Manual work may not provide very accurate data, while web page scraping tools provide great accuracy. These tools also enable retrieving any type of data and images - text, Word, PDF, JPEG, or GIF - from websites built with different technologies like PHP, HTML, JSP, ASP, JavaScript, AJAX, etc. The scraped data can also be converted into a desired format like XML, CSV, or Excel, or into databases like MS Access, MS-SQL, MySQL, etc.
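As a small illustration of that last conversion step, here is a Python sketch that writes the same scraped records to both CSV and XML using only the standard library. The records themselves are hypothetical.

```python
import csv
import xml.etree.ElementTree as ET

# Hypothetical scraped records, already converted to structured form.
records = [
    {"product": "Widget", "price": "9.99"},
    {"product": "Gadget", "price": "24.50"},
]

# Save as CSV for spreadsheets.
with open("products.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["product", "price"])
    writer.writeheader()
    writer.writerows(records)

# Save the same data as XML for systems that expect it.
root = ET.Element("products")
for rec in records:
    item = ET.SubElement(root, "product")
    for key, value in rec.items():
        ET.SubElement(item, key).text = value
ET.ElementTree(root).write("products.xml", encoding="utf-8",
                           xml_declaration=True)
```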

With the availability of web scraping tools, gathering information is no more time consuming. One need not spend hours to complete such a simple task. The scrapers do the work for you.


Source: http://www.articlesnatch.com/Article/Web-Screen-Scraping--Screen-Scraping--Website-Scraping--Web-Page-Scraping--Web-Scraper/923102#.Ubgpa9gY_Dc

Saturday 8 June 2013

How Web Data Extraction Services Will Save Your Time and Money by Automatic Data Collection

Data scraping is the process of extracting data from the web using software programs, drawing from proven websites only. Anyone can use the extracted data for any purpose they desire, across various industries, as the web holds every kind of important data in the world. We provide some of the best web data extraction software. We have the expertise and one-of-a-kind knowledge in web data extraction, image scraping, screen scraping, email extraction services, data mining, and web grabbing.

Who can use Data Scraping Services?

Data scraping and extraction services can be used by any organization, company, or firm that would like to have data from a particular industry, data on targeted customers, data on a particular company, or anything else that is available on the net, such as email IDs, website names, or search terms. Most of the time, a marketing company will use data scraping and data extraction services to market a particular product in a certain industry and to reach targeted customers. For example, if company X would like to contact restaurants in a particular California city, our software can extract the data on that city's restaurants, and the marketing company can use this data to market its restaurant-related product. MLM and network marketing companies also use data extraction and data scraping services to find new customers, by extracting data on prospective customers and contacting them by telephone, postcard, or email marketing; in this way they build their huge networks and large groups for their own products and companies.

We have helped many companies find the particular data they need. A few examples of the services we provide follow.

Web Data Extraction

Web pages are built using text-based mark-up languages (HTML and XHTML), and frequently contain a wealth of useful data in text form. However, most web pages are designed for human end-users and not for ease of automated use. Because of this, tool kits that scrape web content were created. A web scraper is an API to extract data from a web site. We help you to create a kind of API which helps you to scrape data as per your need. We provide quality and affordable web data extraction applications.

Data Collection

Normally, data transfer between programs is accomplished using data structures suited for automated processing by computers, not people. Such interchange formats and protocols are typically rigidly structured, well-documented, easily parsed, and keep ambiguity to a minimum. Very often, these transmissions are not human-readable at all. That's why the key element that distinguishes data scraping from regular parsing is that the output being scraped was intended for display to an end-user.

Email Extractor

A tool that automatically extracts email IDs from reliable sources is called an email extractor. It basically serves the function of collecting business contacts from various web pages, HTML files, text files, or any other format, without duplicate email IDs.
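A bare-bones email extractor can be sketched in a few lines of Python: a pragmatic (not RFC-complete) regular expression plus de-duplication. The sample text is illustrative.

```python
import re

# A pragmatic (not RFC-complete) email pattern.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def extract_emails(*texts):
    """Collect email IDs from any text sources, without duplicates."""
    seen = {}
    for text in texts:
        for address in EMAIL_RE.findall(text):
            seen.setdefault(address.lower(), None)  # dedupe, keep order
    return list(seen)

page = "Contact sales@example.com or Sales@Example.com; support@example.org."
print(extract_emails(page))
# ['sales@example.com', 'support@example.org']
```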

Screen scraping

Screen scraping refers to the practice of reading text information from a computer display terminal's screen and collecting visual data from a source, instead of parsing data as in web scraping.

Data Mining Services

Data mining services extract patterns from information. Data mining is becoming an increasingly important tool for transforming data into information. We can deliver the results in any format, including MS Excel, CSV, HTML, and many others, according to your requirements.

Web spider

A Web spider is a computer program that browses the World Wide Web in a methodical, automated manner or in an orderly fashion. Many sites, in particular search engines, use spidering as a means of providing up-to-date data.
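A minimal spider can be sketched in Python with the requests and BeautifulSoup libraries: a breadth-first walk over same-site links with a polite delay. The start URL is a placeholder, and a real spider would also honor robots.txt.

```python
import time
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def crawl(start_url, max_pages=20):
    """Browse pages of one site in an orderly, breadth-first fashion."""
    domain = urlparse(start_url).netloc
    queue, visited = deque([start_url]), set()
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        visited.add(url)
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for anchor in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, anchor["href"])
            if urlparse(link).netloc == domain:  # stay on the same site
                queue.append(link)
        time.sleep(1)  # be polite to the server
    return sorted(visited)

print(crawl("https://example.com/"))
```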

Web Grabber

Web grabber is just another name for data scraping or data extraction.

Web Bot

Web Bot is a software program that is claimed to be able to predict future events by tracking keywords entered on the internet. Web bot software is a good program to pull out articles, blogs, relevant website content, and similar website data. We have worked with many clients on data extraction, data scraping, and data mining, and they are really happy with our services; we provide high-quality service and make your data work easy and automatic.


Source: http://ezinearticles.com/?How-Web-Data-Extraction-Services-Will-Save-Your-Time-and-Money-by-Automatic-Data-Collection&id=5159023

Thursday 6 June 2013

Data Mining Explained

Overview
Data mining is the crucial process of extracting implicit and potentially useful information from data. It uses analytical and visualization techniques to explore and present information in a format that is easily understandable by humans.

Data mining is widely used in a variety of profiling practices, such as fraud detection, marketing research, surveys and scientific discovery.

In this article I will briefly explain some of the fundamentals and its applications in the real world.

Herein I will not discuss related processes of any sorts, including Data Extraction and Data Structuring.

The Effort
Data mining has found applications in various fields such as finance, health care and bio-informatics, business intelligence, social network data research, and many more.

Businesses use it to understand consumer behavior, analyze clients' buying patterns, and expand their marketing efforts. Banks and financial institutions use it to detect credit card fraud by recognizing the patterns involved in fraudulent transactions.

The Knack
There is definitely a knack to data mining, as there is with any other field of web research activity. That is why it is referred to as a craft rather than a science. A craft is the skilled practice of an occupation.

One point I would like to make here is that data mining solutions offer an analytical perspective on the performance of a company based on historical data, but one needs to consider unknown external events and deceitful activities. On the flip side, it is all the more critical, especially for regulatory bodies, to forecast such activities in advance and take necessary measures to prevent such events in the future.

In Closing
There are many important niches of web data research that this article has not covered. But I hope it provides you a starting point to drill down further into this subject, if you want to do so!

Should you have any queries, please feel free to mail me. I would be pleased to answer each of your queries in detail.


Source: http://ezinearticles.com/?Data-Mining-Explained&id=4341782

Monday 3 June 2013

Beneficial Data Collection Services

The internet is becoming the biggest source for information gathering. A variety of search engines are available over the World Wide Web which help in searching for any kind of information easily and quickly. Every business needs relevant data for its decision making, for which market research plays a crucial role. One of the services booming very fast is data collection services. This data mining service helps in gathering the relevant data that is hugely needed for your business or personal use.

Traditionally, data collection has been done manually, which is not very feasible in the case of bulk data requirements. People still use manual copying and pasting of data from web pages, or download a complete website, which is sheer wastage of time and effort. Instead, a more reliable and convenient method is the automated data collection technique. There are web scraping techniques that crawl through thousands of web pages for a specified topic and simultaneously incorporate this information into a database, XML file, CSV file, or other custom format for future reference. A few of the most commonly used web data extraction processes are: crawling websites that provide information about competitors' pricing and featured data; spidering a government portal to extract the names of citizens for an investigation; and harvesting websites that have a variety of downloadable images.

Aside from this, there is a more sophisticated method of automated data collection service. Here, you can easily scrape website information on a daily basis automatically. This method greatly helps you in discovering the latest market trends, customer behavior, and future trends. A few of the major examples of automated data collection solutions are monitoring price information; collecting data from various financial institutions on a daily basis; and verifying different reports on a constant basis and using them to take better and more progressive business decisions.

While using these services, make sure you use the right procedure. For example, when you are retrieving data, download it into a spreadsheet so that the analysts can do the comparison and analysis properly. This will also help in getting accurate results in a faster and more refined manner.


Source: http://ezinearticles.com/?Beneficial-Data-Collection-Services&id=5879822