Tuesday, December 1, 2009

Offshore Software Development



If search is a measure of the desire to find, then Google search data shows that interest in offshore software development has declined and then plateaued over the last three years. Although it may be difficult to determine actual spending on offshore development, one might expect it to have increased during the all-too-familiar "downsizing" period. Since "cost cutting" is synonymous with downsizing, you would expect to see more interest in offshore development. And if we assume that the decline in search reflects a decline in desire, then it seems not much of the offshore development talent has been utilized.
Perhaps the abundance of readily available talent at home is the reason. Clearly, the current economic slowdown has stirred up plenty of motivation across the development workforce in the U.S. I am not suggesting that hourly pay for software development and support in the U.S. has plummeted to match that of offshore centers, although in some areas it seems it has. But the cost differentiators have clearly diminished.
Some suggest this is a temporary situation, tied to the economic health of the U.S.
What do you think? Are U.S. companies using more local talent than they did in the 2000-2004 period? Will the tide turn as the economy does?

Tuesday, November 17, 2009

Configuration Locking in IIS 7

IIS 7 introduced a new feature called "configuration locking". Among other things, it makes it easier for administrators to store configuration files on a network share, typical in web server farms.

A common problem that occurs with configuration locking is an error similar to this:

HTTP Error 500.19 - Internal Server Error. The requested page cannot be accessed because the related configuration data for the page is invalid.


In most cases you will need to "allow" access to (unlock) the handlers or modules section in your IIS applicationHost.config file, using Notepad or any text editor. Here is how:

Open the applicationHost.config file, located here: %windir%\system32\inetsrv\config\applicationHost.config

Always make a copy of this file before making any changes to it.

Change this line:

<section name="handlers" overrideModeDefault="Deny" />

To:

<section name="handlers" overrideModeDefault="Allow" />

If you continue to receive this error, repeat the same for:

<section name="modules" allowDefinition="MachineToApplication" overrideModeDefault="Deny" />
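The manual edit above can also be scripted. Here is a minimal sketch, assuming the stock applicationHost.config schema in which each locked section is a `<section>` element carrying an `overrideModeDefault` attribute; `SAMPLE` below is a cut-down stand-in for the real file, which you should back up first:

```python
import xml.etree.ElementTree as ET

# Cut-down stand-in for %windir%\system32\inetsrv\config\applicationHost.config
SAMPLE = """<configuration>
  <configSections>
    <sectionGroup name="system.webServer">
      <section name="handlers" overrideModeDefault="Deny" />
      <section name="modules" allowDefinition="MachineToApplication" overrideModeDefault="Deny" />
    </sectionGroup>
  </configSections>
</configuration>"""

def unlock_section(config_xml, section_name):
    """Return the config text with overrideModeDefault flipped to Allow
    for the named section, leaving every other section untouched."""
    root = ET.fromstring(config_xml)
    for section in root.iter("section"):
        if section.get("name") == section_name:
            section.set("overrideModeDefault", "Allow")
    return ET.tostring(root, encoding="unicode")

unlocked = unlock_section(SAMPLE, "handlers")
```

IIS also ships a command-line tool for the same job, e.g. `appcmd unlock config -section:system.webServer/handlers`, which is usually the safer route on a production box.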


Steve Bashiri

Thursday, October 1, 2009

Web Analytics Governance and Metrics Standard

An effective web analytics practice needs proper staffing, management, planning, and diligence. To be effective, a governance committee must be formed, consisting of individuals with specific roles and responsibilities (see table below). With the right resources and clear expectations, the online business can put valuable data into action to achieve higher performance. But how do you obtain this valuable data?

Data must traverse several stages before it is ready to be acted upon. From visitor-generated raw format to actionable metrics, data must go through processes similar to a typical factory before it is consumable. Once raw data is sessionized and aggregated, it must go through segmentation and calculation. An analyst then slices and dices the data to find correlations and nuggets of information that are valuable to the business. Data analysis generally produces many findings. With data sharing and collaboration within the governance committee, findings can be pared down to a few high-value, high-confidence metrics. These are then communicated as recommendations to the business users.
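As a toy illustration of the first "factory" step above, here is a minimal sketch of sessionization (field names and the 30-minute inactivity timeout are illustrative assumptions, not a reference to any specific tool): raw hits are grouped into per-visitor sessions, then aggregated into a simple page-views-per-session metric.

```python
from collections import defaultdict

TIMEOUT = 30 * 60  # assumed 30-minute inactivity window, in seconds

def sessionize(hits):
    """hits: iterable of (visitor_id, timestamp, page) tuples, e.g. from raw logs.
    Returns a list of sessions, each a chronological list of one visitor's hits."""
    sessions = []
    open_session = {}  # visitor_id -> index of that visitor's current session
    for visitor, ts, page in sorted(hits, key=lambda h: (h[0], h[1])):
        idx = open_session.get(visitor)
        # Start a new session on first sight or after the timeout elapses.
        if idx is None or ts - sessions[idx][-1][1] > TIMEOUT:
            sessions.append([])
            idx = len(sessions) - 1
            open_session[visitor] = idx
        sessions[idx].append((visitor, ts, page))
    return sessions

def page_views_per_session(sessions):
    """Aggregation step: count page views in each session."""
    return [len(s) for s in sessions]
```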

An online, data-driven organization usually has many contributors to the decision-making process. These data consumers have many sources from which to extract data. Even though the nature of the data will be different for each of these sources (i.e., marketing, finance, or technology), there needs to be a baseline. A standard for data means having a common denominator, a shared yardstick by which everyone gathers and analyzes data. Some examples include how online visitor interactions are measured, e.g., organization-wide agreement on how visitors are counted (by cookie, login ID, or session parameter); what constitutes a page view and whether it accurately represents the visitor's interaction with business content; or what method should be used to represent bounced visits. Is a bounce any visit with a single page view, only single page views from a filtered list of IP addresses, or only those from "known" marketing programs (SEO, SEM, email, banners, and affiliates)? The final decision on the organization's metrics standard will depend as much on the nature of the business as it does on the individuals in the governance committee. However, once established, everyone's analysis and findings will be based on these standards. Only then does the collective wisdom of the organization-wide data become far more valuable than any single contribution.
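To make the idea of a shared yardstick concrete, here is a minimal sketch of one possible bounce standard discussed above (single-page visit, with internal IPs filtered out first); the visit record shape is a hypothetical illustration:

```python
def bounce_rate(visits, internal_ips=frozenset()):
    """visits: list of dicts like {"ip": "1.2.3.4", "page_views": 3}.
    Applies one candidate standard: a bounce is a single-page visit,
    counted only after filtering out internal (e.g. employee) IPs."""
    qualified = [v for v in visits if v["ip"] not in internal_ips]
    if not qualified:
        return 0.0
    bounces = sum(1 for v in qualified if v["page_views"] == 1)
    return bounces / len(qualified)
```

The point is not this particular formula but that, once the committee commits to a definition, every analyst computes the metric the same way.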


Business User

Role

  • Responsible for budget
  • Interface with business analyst
  • Acts upon analysis and recommendations

Tools

  • High level knowledge

Analysis

  • Light analysis

Technology

  • High level knowledge

Management

  • Budgets and other dept/Org resources

Web Analyst

Role

  • Interprets web data
  • Finds nuggets of high-value information
  • Focuses on performance and optimization of the online properties
  • Makes recommendations

Tools

  • Power user
  • Can interface with and extract data from all tools
  • Configures tools

Analysis

  • Deep analysis
  • Slices and dices data
  • Interprets quantitative and qualitative data

Technology

  • Good understanding of designs and methods
  • Assists on evaluation and recommendations

Management

  • Little to no duties

Business Analyst

Role

  • Interfaces with business users
  • Deep understanding of the websites
  • Gathers business requirements


Tools

  • Interfaces with tools to extract data
  • Designs reporting solutions


Analysis

  • Analyzes online and offline data
  • Documents requirements


Technology

  • Conceptual knowledge

Management

  • Some budget and human resource duties


Developer

Role

  • Develops tagging and programming to capture business data
  • Interfaces with web analyst

Tools

  • Deep knowledge of web analytics tools
  • Designs and develops best practices
  • Documents technical requirements

Analysis

  • Light analysis
  • Reviews technical data for optimum online system performance and availability

Technology

  • Deep knowledge of the methods and practices
  • Makes recommendations on and evaluates new technology

Management

  • Little to no duties

Project Management

Role

  • Major project owner
  • Facilitator, responsible for meeting deadlines


Tools

  • High level knowledge


Analysis

  • No analysis


Technology

  • Understands concepts and main drivers

Management

  • Manages major projects
  • Oversees all resources contributing to projects

Wednesday, July 15, 2009

Talent Recognition


A man sat in a Washington, DC Metro station on a cold January morning in 2007 and began to play the violin. He played six Bach pieces for about 45 minutes. During that time, approximately two thousand people went through the station, most of them on their way to work. After three minutes, a middle-aged man noticed there was a musician playing. He slowed his pace, stopped for a few seconds, and then hurried on to meet his schedule.

4 minutes later:
The violinist received his first dollar: a woman threw the money into the till and, without stopping, continued to walk.

6 minutes:
A young man leaned against the wall to listen to him, then looked at his watch and started to walk again.

10 minutes:
A 3-year-old boy stopped to look at the violinist, but his mother tugged him along hurriedly. Finally the mother pushed hard and the child continued to walk, turning his head all the time. This action was repeated by several other children; every parent, without exception, forced them to move on.

45 minutes:
The musician played on. Only 6 people stopped and stayed for a while. About 20 gave him money but continued to walk at their normal pace.
He collected $32.

1 hour:
He finished playing and silence took over. No one noticed. No one applauded, nor was there any recognition.

No one knew this, but the violinist was Joshua Bell, one of the best musicians in the world. He played one of the most intricate pieces ever written, on a violin worth $3.5 million. Two days before, Joshua Bell had sold out a theater in Boston where the seats averaged $100.

This is a real story. Joshua Bell playing incognito in the metro station was organized by the Washington Post as part of a social experiment about perception, taste, and people's priorities. The questions raised: in a commonplace environment at an inappropriate hour, do we perceive beauty? Do we stop to appreciate it? Do we recognize talent in an unexpected context?

One possible conclusion reached from this experiment could be:

If we do not have a moment to stop and listen to one of the best musicians in the world playing some of the finest music ever written, with one of the most beautiful instruments ....

How many other things are we missing?

Thursday, July 9, 2009

Your Analytics Widget

My stats on 07/09/2009

My Profiles    Twitter Followers    Flickr Views    Slideshare Views    YouTube Views
bosilytics     562                  -               -                   -
yas_dev        11                   3               5                   -
stevebash      49                   -               -                   -
mj             2669                 -               -                   -
stevebash001   -                    -               -                   0
sbashiri234    -                    14              -                   -

Collect, Store, Share, and Download your own stats at www.YourAnalyticsSite.com

Saturday, November 8, 2008

Robbed in the Daylight

Online businesses are losing billions of dollars every year to fraud. Businesses in finance, ecommerce, social networking, and other verticals are targeted every day by individuals who have made it their job to steal from them. These fraudsters range from novices to highly skilled programmers and systems and networking experts who can hack into businesses and steal. The damage they cause ranges from identity theft to stealing products to stealing customers. With sophisticated programs, they appear as normal customers while they attack a business. Most businesses are ill-prepared and have little or no plan of action to combat these challenges. The ones that do have a process in place deal with these attacks in a reactionary way: they plug the hole after it is discovered, after it has already done some damage. By monitoring and analyzing data, fraudsters can be detected before they inflict significant damage.

Methods
A fraudster may create multiple accounts on a social networking site. By multiple, I mean hundreds and sometimes thousands of phony accounts. This would obviously be difficult for a human being to do manually, so the fraudster will write code to do the work automatically. The code signs up accounts on the web site with bogus information and creates user profiles. These profiles attract legitimate users, who offer their own information in the hope of connecting with others. This information can sometimes be very personal. But mainly, the fraudster is interested in collecting account information to sell to other businesses, or in tricking users into joining other online services. One such scam is collecting customers' email addresses for use in phishing. A phishing scam is when a fraudulent web site poses as a legitimate one to collect information from victims, such as their credit card numbers or their credentials for other web sites.




Detection
An online business needs to collect and analyze data to detect fraudulent activity. The sophistication of a fraudster's methods can make their behavior difficult to detect. However, by informing themselves, businesses can review their online data for unusual visitor activity and investigate the cause. This is done by learning the patterns in their data; every business has its own unique patterns of visitor interaction. One common method of detecting fraudulent online activity is to look for a large amount of activity from a single IP address in a short amount of time. This could mean a fraudster is running a robot program from a single computer to perform attacks. One has to make sure the IP address is not a proxy address, which at times represents many different individuals, possibly legitimate ones. The activities to look at can be further segmented to focus on the high-value ones, such as account signup, login, and sending emails to other users.
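The single-IP detection method above can be sketched as a sliding-window counter. This is a toy illustration, not a production detector: the event shape, window, and threshold are all assumptions that each business would tune to its own traffic patterns.

```python
from collections import defaultdict, deque

def flag_heavy_ips(events, window=60, threshold=100):
    """events: iterable of (timestamp, ip, action) tuples, in time order.
    Flags any IP that performs more than `threshold` actions within any
    `window`-second sliding window. Both knobs are illustrative defaults."""
    recent = defaultdict(deque)  # ip -> timestamps inside that IP's current window
    flagged = set()
    for ts, ip, action in events:
        q = recent[ip]
        q.append(ts)
        # Drop timestamps that have slid out of the window.
        while q and ts - q[0] > window:
            q.popleft()
        if len(q) > threshold:
            flagged.add(ip)
    return flagged
```

A real pipeline would also segment by high-value action (signup, login, sending mail) and whitelist known proxy addresses before alerting, for the reasons given above.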

To look more like legitimate visitors, a fraudster's attack can come from multiple computers spread across a large geographic area, or at least across multiple IP addresses, making it harder to detect. Fraudsters with more resources at their disposal can hop from location to location and maintain banks of computers and modems to avoid detection. Like a business, they optimize their code and methods to be more effective.

Prevention
A common method of preventing online fraud is the CAPTCHA. Wikipedia defines a CAPTCHA as a "type of challenge-response test used in computing to ensure that the response is not generated by a computer".
These are often images of single words or phrases that are morphed and distorted so that a human can read them but a computer would find them difficult to decipher. The images are placed on pages where visitors sign up, log into their accounts, or perform some other high-value action, and the visitor is asked to type in what they see before proceeding. Below is a sample of a CAPTCHA image you might see on a web site.






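The challenge-response idea can be sketched without the image-distortion step (which is the hard part of a real CAPTCHA). In this toy version the server generates a random code, and the high-value action is allowed only when the visitor's typed reply matches it:

```python
import secrets
import string

ALPHABET = string.ascii_uppercase + string.digits

def new_challenge(length=6):
    """Generate the random code; a real CAPTCHA would render this
    as a morphed, distorted image rather than plain text."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

def verify(challenge, response):
    """Allow the action only when the response matches, ignoring
    case and surrounding whitespace typed by the visitor."""
    return response.strip().upper() == challenge
```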
There is no substitute for prevention like educating your visitors and customers about the fraudulent activities they might be subjected to. Businesses need to regularly communicate with their visitor community about what to watch out for, in relation to their site, so that visitors do not become victims of online fraud.

As the Internet has evolved and continues to evolve, so have online fraud and criminal activity. Businesses that have not been paying attention to this area need to start engaging now; chances are they are already being subjected to fraud in one form or another. To start, the online business needs to stay vigilant about educating itself on the different forms of fraud. It then needs to develop processes for collecting and analyzing data that provide insight into possible fraud activity. Next, tools need to be put in place to support the designed processes. These can be software available online or developed by the business's own resources. There are also companies that provide services to combat fraud.

Fraud management should be treated like most other business process management. It needs its own life cycle: education, data gathering and analysis, detection, and prevention. First, learn the different fraud methods being used; then, through data analysis, detect fraudulent activities; then put prevention measures in place (plugging the hole); and then the process begins all over again, in a continuous cycle. Those involved in managing fraud need to understand that, just like pests in the house, once you see one or two cockroaches, there are probably hundreds or thousands more lurking where you can't see them.




Steve Bashiri

Sunday, September 21, 2008

Web Analytics

For my first post, I thought I would write about some hot technical best practices in web analytics. But instead, I thought it best to address a more fundamental question, one that has been shaping this industry for nearly thirteen years: what is web analytics? Well, easy. It's the practice of measuring and monitoring all that interacts with an organization's web properties.

There are two important aspects of this definition: measurement and interaction.
But what are these measurements? Are they the campaigns, purchases, sign-ups, downloads, and conversions? Or page views, visits, and visitors? The answer is all of the above, and much more.


And what about the interaction? The interaction we measure is from visitors. And by visitor, I mean the broad sense of the word, which includes crawlers, robots, and spiders.

Until recently, web analytics vendors were developing tools that measured content navigation. Whether for campaigns, purchases, or conversions, these tools mostly dealt with granular interaction with content. In the ten-to-fifteen-year life of web analytics, it is only in recent years that vendors have started measuring interactions from the visitor's perspective. From voice of customer to customer experience to performance and uptime monitoring, all have entered the web analytics arena. These tools shift the focus from purely content analysis to visitor analysis, and measuring the visitor's experience has become popular with online marketing professionals.

Now the question is, what's next in this evolution? What will we measure next?

Steve Bashiri