Monday, February 4, 2019

Standard Directory System

STD -- Standard Directory


The 'Standard Directory System' has been in use in various forms for more than 30 years. The principal aim of the system was to avoid collisions: that is, to allow files that are different from one another to never overwrite one another. This was accomplished by having conventions for the names used by various common programs. As the system has developed, it has also come to organize things so that they are easier to find. This is accomplished through conventions, as above, for particular programs, general conventions for particular types of programs, and conventions for standardized directory structures. The standard anticipates various types of common requirements and assigns standard names to those requirements.

Standard Directory Names

Where possible, conventions that already exist have been followed. In some instances, such as the 'usr' directory under UNIX, it has been decided to deviate from the convention so that collisions are avoided. In other instances, such as the 'home' directory, the convention has been followed to allow easy transition back and forth between the systems.

General Conventions

Three letter name sizes

As a rule, directory names are only three letters long. There are several reasons for this:
  • People generally find 'TLAs' easier to remember.
  • By making most 'standard' names the same size, other directories are easy to spot in a listing.
  • Organizing large disks can require very deep structures, and long names often over-run the allowable size of a path. This is particularly true for various target media. For instance, some media limit the total path length to only 63 characters, and ISO 9660 CD-ROMs have a maximum path length of 207 characters.
By making the standard names only three characters long, we free up more space for other software that does not follow the standards. For instance, here is a real pathname as constructed by other software and its equivalent as a 'standard directory name':
  1. C:\Program Files\Microsoft Small Business\Small Business Accounting Addins\Fixed Asset Manager\Templates
  2. C:\std\app\mcs\msb\sba\fam\tpl
Example 1 is 104 characters long. Example 2 is only 30. Should there arise an occasion (as there does in the real world) where one would want to copy a backup with the full directory structure into the bottom of the path, the result would exceed the length allowable on most CD-ROMs and could not be copied there. Worse, many file systems will allow you to write a path that exceeds their limits and then not allow you to read or delete that path. Experience with real-world systems, where entire structures must be written deep into other directory structures, shows that problems arise quickly if very long names are used. Since we can't control the use of long names in other systems, we are doubly obliged to keep our own usage to a minimum.
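The difference matters in practice. A minimal sketch (Python; the media limits are the figures quoted above, used here as illustrative assumptions) that checks whether a path fits a given target medium:

```python
# A quick check of the two example paths against the media limits
# mentioned above. Real targets vary; treat these values as assumptions.
MEDIA_LIMITS = {
    "restrictive media": 63,   # some media cap total path length at 63
    "ISO 9660 CD-ROM": 207,    # maximum path length on ISO 9660
}

def fits(path):
    """Return, per medium, whether `path` fits within its length limit."""
    return {medium: len(path) <= limit for medium, limit in MEDIA_LIMITS.items()}

long_path = (r"C:\Program Files\Microsoft Small Business"
             r"\Small Business Accounting Addins\Fixed Asset Manager\Templates")
std_path = r"C:\std\app\mcs\msb\sba\fam\tpl"

print(len(long_path), fits(long_path))  # 104 characters: too long for the 63-char medium
print(len(std_path), fits(std_path))    # 30 characters: fits everywhere
```

The same check can be run over an entire tree before mastering a CD-ROM or copying a backup into a deep target directory.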

Simple Names

The standard specifies that names should be, as much as possible, formed according to simple 'lowest common denominator' rules. This is in keeping with the principle that we are attempting to avoid 'collisions'. In this case, the collisions are with the conventions of a given operating system. That means:
  • No spaces; use underscores if needed.
  • No characters that conflict with operating system shells -- example: '(', ')', '[', etc.
  • Generally use lower-case letters to remain compatible with case-sensitive systems such as HTTP URLs.
  • If practical, there is still a bias toward 8.3 DOS-style file names, especially for files that may travel to many operating environments -- example: readme.txt
As a general rule, the name should be able to retain its exact structure across all environments where it is likely to reside.
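As an illustration only (the exact rule set below is an assumption distilled from the text, not part of the standard), the naming rules can be expressed as a small validator:

```python
import re

# Sketch of the 'lowest common denominator' rules described above:
# lower-case letters, digits and underscores; no spaces; no shell
# metacharacters; at most one dot separating name and extension.
SIMPLE_NAME = re.compile(r"[a-z0-9_]+(\.[a-z0-9_]+)?")
DOS_83 = re.compile(r"[a-z0-9_]{1,8}(\.[a-z0-9_]{1,3})?")

def is_simple_name(name):
    """True if `name` should survive intact across operating environments."""
    return SIMPLE_NAME.fullmatch(name) is not None

def is_dos_83(name):
    """True if `name` also fits the stricter 8.3 DOS-style convention."""
    return DOS_83.fullmatch(name) is not None

# Examples: 'readme.txt' passes both checks; 'My File(1)' fails both.
```

A check like this could be run over candidate names before they are committed to a shared structure.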

Some specific names

Root directory

The root directory currently in use is 'std'. This is not particularly important; for anyone using the system outside of ones created by us, any three-letter root that does not collide with conventional directories would be fine.

Standard Sub-Directory Names

This is not an exhaustive list and it changes from time to time as standards change.

std -- Root Directory. The root directory as currently used. This can reside on any drive or, in the case of UNIX, would hang off the regular root. Example: C:\std\
  • app -- Applications -- application directories.
  • arc -- Archives -- archive files. Not quite the same as the backup directory: this directory merely contains archives of files that are taking up space but still may be needed from time to time. It is also a 'scratch area' to allow archiving temporarily to free up disk space.
  • bin -- Executable Files -- binary executables and scripts.
  • bkp -- Backup Files -- backup files and related files. These are intended for actual backups. What type of backup resides here depends upon the location of the directory.
  • cyg -- Cygwin -- under Windows environments, reserved for Cygwin.
  • dvl -- Development -- software development directories.
  • doc -- Documents -- *.doc, *.txt, *.xls, *.ppt, etc.
  • home -- User Home Directories -- retains the UNIX convention for the home directory. 'home' is now deprecated in favor of 'hme'.
  • hme -- User Home Directories -- by making user homes different from both Windows logins and UNIX logins, the system is more 'OS agnostic'. It also does not suffer from the clutter put in home directories by well-meaning admins. Copying /std/hme/nzt/doc from one system to another should not interfere with anyone else's conventions. There is also the matter of symmetry: branches should generally be three letters. That seems to work best in practice, and it makes directory trees easier to visualize, less tedious to type, etc. Leaves can (and arguably should) take longer names. Thus /std/hme/rst/dvl/vb6/tst/TestTrackBall would be perfectly valid. That path breaks down as:
      std -- standard root
      hme -- home directories
      rst -- Robert Stephen Trower's home directory
      dvl -- development
      vb6 -- VB6-specific development
      tst -- test programs
      TestTrackBall -- working directory to test TrackBall code
  • inc -- Include Directories -- contains various types of include file. This is generally different from the C-language standard 'include' directory.
  • lib -- Library Directories -- libraries. This typically contains *.lib, *.o or *.obj files.
  • std -- std 'reflection' directory -- under C:\std, for instance, this would be C:\std\std. It is used as a 'reflection' of a standard structure that might, for instance, be the subject of a network share.
  • svn -- Subversion -- Subversion revision system.
  • tmp -- Temp Files -- temporary files. This is especially used by things such as session variables.
  • trn -- Transfer Files -- transfer directory. When files are being transferred into or out of the system, this is where they reside during transfer.
  • web -- WWW Files -- root of things related to a local web server.
  • wrk -- Work/Scratch Files -- many things done on a file system are experimental, have a limited lifetime or require some thought to create a permanent home. The 'wrk' directory gives a quick area to do work that will not interfere with the rest of the system.
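A minimal bootstrap of the structure above might look like this (Python sketch; the directory list is taken from the table, and the root location is whatever you choose):

```python
import os

# Standard sub-directories as listed above. 'hme' is shown rather than
# the deprecated 'home'; adjust the list to taste.
STD_DIRS = ["app", "arc", "bin", "bkp", "cyg", "dvl", "doc", "hme",
            "inc", "lib", "std", "svn", "tmp", "trn", "web", "wrk"]

def make_std_tree(root):
    """Create the standard skeleton under `root`; return the paths created."""
    created = []
    for name in STD_DIRS:
        path = os.path.join(root, name)
        os.makedirs(path, exist_ok=True)  # idempotent: re-running is harmless
        created.append(path)
    return created

# Usage: make_std_tree(r"C:\std") on Windows, or make_std_tree("/std") on UNIX.
```

Because the names are fixed by convention, the same script serves for any drive or host.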

Standard Path Variable

The 'standard path' is formed so that the commands available depend upon your context within the file system. For instance, there may be utilities specific to one set of files that do not apply to another. This ensures that when appropriate, the commands are on your path, but does not require your path to include everything on the disk.
The standard path usually contains something similar to the following:

Standard Paths

Absolute Paths

Personal command directory (over-rides all): C:\std\home\myname\bin;
Global command directory: C:\std\bin;
Root command directory: \bin;

Relative Paths

Local command directory: bin;
Parent command directory: ..\bin;
Grand-parent command directory: ..\..\bin;
Great-grand-parent command directory: ..\..\..\bin;

The above forms a path similar to this: path=C:\std\home\myname\bin;C:\std\bin;\bin;bin;..\bin;..\..\bin;..\..\..\bin;

Operating System Paths

Other paths are required as well, such as:

Adjunct OS path (Cygwin): C:\std\cyg\bin;
The operating system path: C:\WINDOWS;C:\WINDOWS\SYSTEM32;

It may also be convenient for certain things to be usable globally from the command line:

Application Paths

Application path: C:\std\svn\bin;

Putting things together in the intended order gives a path that looks like this:

path=C:\std\home\myname\bin;C:\std\bin;\bin;bin;..\bin;..\..\bin;..\..\..\bin;C:\std\cyg\bin;C:\WINDOWS;C:\WINDOWS\SYSTEM32;C:\std\svn\bin;
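The assembly of that path can be sketched programmatically (Python; 'myname' is a placeholder login as above):

```python
# Assemble the standard path from its component groups, in the
# intended order described above.
personal   = [r"C:\std\home\myname\bin"]                  # over-rides all
global_    = [r"C:\std\bin", r"\bin"]                     # global, then root
relative   = ["bin", r"..\bin", r"..\..\bin", r"..\..\..\bin"]
adjunct_os = [r"C:\std\cyg\bin"]                          # Cygwin
os_paths   = [r"C:\WINDOWS", r"C:\WINDOWS\SYSTEM32"]
app_paths  = [r"C:\std\svn\bin"]                          # per-application

def build_path(*groups):
    """Join the groups into a single semicolon-delimited PATH string."""
    return ";".join(p for group in groups for p in group)

path = build_path(personal, global_, relative, adjunct_os, os_paths, app_paths)
print("path=" + path)
```

Keeping the groups separate makes the precedence explicit and makes it easy to drop or reorder a group for a particular machine.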

As can be seen above, this is already a substantial path. However, it allows a very wide range of commands to be available while working and leaves room for the many applications that add their own paths during installation. Because many of the commands are on relative paths, the search for a command is much faster than it would be if all application commands were on the path at once. The bin;..\bin;..\..\bin; relative path construct also allows the placement of commands to show a finer granularity and greater specificity to the task at hand.

By placing everything that pertains to a given system within its own directory structure, backup, removal and restoration of backups is simple and has a minimal impact on the rest of the system.

Relative paths also allow the simple relocation of directories and also allow the creation of completely operational copies of the directory in other places. It is possible, therefore, to try a radical global change on a group of directories on a copy rather than the original.

Gun control

People with guns are definitely killing people. 

The U.S. second amendment is clearly, should you look at its historical context, meant to guarantee citizens the right to bear arms precisely to resist an over-reaching state. Emphasis mine:

"In a 5-4 decision, the [United States Supreme] Court, meticulously detailing the history and tradition of the Second Amendment at the time of the Constitutional Convention, proclaimed that the Second Amendment established an individual right for U.S. citizens to possess firearms and struck down the D.C. handgun ban as violative of that right. " --

The U.S. Constitution is the instrument whereby the Citizens of the United States delegate the partial exercise of *their* sovereign power. That instrument specifically forbids disarming the citizenry. The interpretation that 'militia' somehow puts that power right back in the hands of the state is without merit. 

Because it is so clear in protecting individual rights, the U.S. constitution acts to some extent like a proxy for what is the sensible intent of a citizenry as a collective. 

Biotechnology Malfeasance

Below is a submission that I made to CBAC. This was a cynical process that was dressed up as a public consultation but was in fact a 'pro-forma' exercise to rubber-stamp a decision already made without public input. As it turned out, public consultation showed that the public was near universally opposed to the legislation contemplated and ultimately passed. 

Comments on the document: Interim Report of the Canadian Biotechnology Advisory Committee to the Biotechnology Ministerial Coordinating Committee

The original source of the document reviewed was here: This has since been removed. However, I found what appears to be the same document archive here:

About me:

I make my living as a software developer and researcher. I have a Bachelor of Science degree (major in Biology). I have no particular vested interest here, other than as a member of the public. I hold a scientific world-view similar to 'the received view of science'. I am somewhat biased in favor of scientific advancement. I am hardly a 'tree-hugger', or an 'eco-nut'.

My concern with respect to the matter at hand is grounded in my understanding of the following:

a)       The opportunistic nature of biological organisms and evolution
b)       The geometric growth of living populations
c)       The insular nature of bureaucracies
d)       How lobby groups influence the public agenda
e)       Intellectual property issues
f)        How answers depend on the particular questions asked

Additionally, I did a little research to look at what others are saying about the various issues involved and looked at some news stories surrounding specific related events.

General Comments:

The CBAC document surveys many different aspects of the subject at hand. Some valid points are raised and an effort has been made to make the document at least seem inclusive. However, the document seems lacking in some important ways. I feel the following are flaws that should be addressed in a final draft:

1)       The potential for catastrophe is not properly treated in this document, even though this is really the heart of the matter. This appears to stem from a misunderstanding of the distinction between biological material and non-biological material. Living things migrate, self-replicate, interbreed and evolve. Current technology can not cure the common cold or safely rid my lawn of weeds. What does this say about our preparedness to deal with a virulent biological threat?
2)       Important recommendations and feedback regarding caution and safety concerns are buried in the report and appear to be 'soft-pedaled'.
3)       Trivial concerns and considerations are presented as if they are equal to very important concerns. This is particularly true with respect to safety, sustainability and public welfare.
4)       Liability issues are ignored entirely.
5)       "Transparency" is dealt with in an abstract way but it is unclear that the process has been effectively transparent. Cursory research shows that the document in question is not a balanced reflection of the views of all stakeholders.
6)       The problems inherent in current intellectual property legislation are not adequately dealt with. See the appalling decision by the federal court here: [Note original is no longer on the web, but has a copy here:] [original no longer available was:] I am certain that informed Canadians would object to this outcome. The judge's decision is soundly based on principles of law. However, the legislative framework and precedent surrounding this is bad.
7)       The uncontrolled spread of biotech material, equipment and expertise might lead to deliberately created bio-weapons. The report does not address the issue at all.
8)       Specific examples from past experience are missing. There is no mention, for instance, of StarLink. Why not?

My general impression of the document is that it is heavily influenced by partisans whose aim is to allow GM, GE products to enter our biosphere for profit. This is troubling in the extreme. It is only at this point in time that we will have the opportunity to decide whether or not it makes sense to let the genie out of the bottle. Once it is out, we may not be able to put it back in. The CBAC document ought to be more balanced and forthright in presenting the issues and recommendations.

Here is what I would like to see:

1)       Get some real biologists involved. We are talking about living things here. They differ markedly from non-living things. It is clear from the content of the document that there is a profound lack of understanding of the importance of the mechanisms of both biological evolution and ecology. GM foods absolutely do not belong under the umbrella of 'novel foods'.
2)       Ban FrankenFoods not already approved.
3)       Mandate proper liability insurance coverage for FrankenFoods in use.
4)       Mandate proper liability insurance coverage for research facilities.
5)       Mandate a "Roll-back" plan. Any FrankenFood introduction should have a back-out strategy that completely undoes the introduction. This includes the costs of removing the mutant material from the ecosystem and repairing any damage done. If you can't afford the cost of the insurance, you can't afford to do it.
6)       Place proper, conservative legislation around FrankenFoods in use.
7)       Fund research to assess risks properly.
8)       Legislate safety standards for research such that material used by and generated by research activities can not escape into the environment. This should include controls on access to advanced equipment and materials that could be used to develop biological weaponry.
9)       Fund an advocacy group to persuade other entities (nations, multinationals) to refrain from doing anything that would allow their GM, GE products to cross our borders.
10)   Legislate severe penalties for vested interest parties found to subvert the process of this public debate. This should include jail time, disgorgement and punitive damages. "Whistle blower" legislation would be good and perhaps bounties to ensure that there is some incentive in place to find and prosecute those who would injure the public good.

Specific comments regarding document content:

Re: 8.1 "The Panel recommends the precautionary regulatory assumption that, in general, new technologies should not be presumed safe unless there is a reliable scientific basis for considering them safe. The Panel rejects the use of “substantial equivalence” as a decision threshold to exempt new GM products from rigorous safety assessments on the basis of superficial similarities because such a regulatory procedure is not a precautionary assignment of the burden of proof".

The quote above appears on page 63. It should be right up front, stated more plainly and should be the central thrust of the report. I believe that these technologies should be assumed unsafe and banned initially.

Re: Section 5.4: "This does not imply, however, a zero-risk approach. … Under circumstances where it is appropriate to use substantial equivalence as a framework to structure the safety assessment of novel foods, it is necessary to ascertain whether the composition of the plant has been changed in any way. "

When we are talking about potential catastrophic failure of our environment, our food supply and life itself, I would say that a 'zero-risk' strategy is the only acceptable standard. Is there a reputable biologist not co-opted by the bio-tech industry that accepts 'substantial equivalence' as a meaningful concept? I do not accept the 'substantial equivalence' concept at all.

There is no discussion of how liability will be dealt with. Say a particular FrankenFood vector wipes out a vital food species, kills scores of people, is responsible for genetic infirmities in people or livestock, destroys eco-systems, etc, etc. Who pays? Should we not work to establish this? I would like to see a comprehensive liability insurance scheme in place. It should be funded by all FrankenFood beneficiaries. It should absolutely cover any and all liabilities arising from the introduction of FrankenFoods into our ecosystems.

Re: "The sessions were designed to achieve a balance of representation from the general public, society, industry, research and academia. However, some representatives from civil society, primarily environmental nongovernmental organizations, chose not to participate, thus diminishing the representation of this group. "

This discussion, as framed by CBAC, has been similar in kind to a discussion about a regulatory framework that assumes the exploitation of children. You start with the assumption that something unthinkable will be done. We are now talking about the mechanics of legislation. It is fairly clear from your document that you do not have the benefit of the following point of view: GM, GE foods could be exceedingly dangerous on a scale that makes their introduction unthinkable at this point in time.

It is troubling that we are blithely discussing a 'regulatory framework' for FrankenFoods. Due to the actions of large integrated companies in the food industry and regulatory failures of other political jurisdictions, we may be obliged to accept FrankenFoods. However, this should be specifically addressed as such ("this is why we are forced to…") and strategies to combat this should be examined.

Re: "The petition presented by these representatives, as well as CBAC’s response, can be found on the CBAC Web site".

I could not find it there. In fact, it took a while to track it down here:

I was troubled by the fact that the response seemed to be tangential to the questions raised and appeared to be carefully avoiding the substance of the petition itself. Even though I had not read the original petition, a careful reading of the response led to the conclusion that the response was meant only to deflect the petition. The response ignored the substance of the petition. Here is where I found the petition itself.

It is chilling to read both the petition and the government response. It is equally chilling that such a substantive treatment of the issues CBAC should be addressing is dealt with in such an off-hand way in the CBAC document. It is curious that you say 'can be found on the CBAC Web site', but do not offer the location in your document.

"CBAC will take into account the feedback from Phase 3 to produce its final report and formal recommendations on the regulation of GM foods."

What does the above actually mean? My cursory inspection of available sources on the internet show that for the most part observers who do not have a vested interest in bio-technology universally disagree with the headlong rush to embrace GM foods. It would be difficult to get that message from a reading of the current report as proposed by CBAC. CBAC seems to already be in the 'pro bio-tech' camp. To the extent that arguments respecting caution are acknowledged, the posture of the document tends toward condescension and the 'handling' of objections.

"This report will be delivered to government in early 2002".

This report and any legislation pertaining to GM foods should definitely be debated in the House of Commons. Hopefully the final report will be a little less cynical, contain a little more substance and address the unique biological issues involved with GM foods.

The report's reference to the 'novel food' concept is troubling. It conveys a lack of understanding of the unique, specific dangers presented by GM foods. These dangers transcend the dangers of an otherwise 'novel' food. I do not worry that a novel source of fiber in the food supply will go on to destroy the ecosystem, create a new virus or pest, end up being incorporated into the genome of my children, etc. The ultimate danger of GM foods is to collapse the ecosystems upon which life itself depends. To blithely insist that GM foods are comfortably on a peer basis with other novel foods is to miss the point entirely. Whatever the likelihood of risks involved with GM foods, those risks are potentially catastrophic. A novel packaging that subsequently turns out to poison, kill and maim might get a couple million of us at worst. GM foods could potentially kill us all.

Re: "Scientists developing products of biotechnology do their work in labs, growth chambers and/or greenhouses. In these settings, the products are contained and should not come in contact with the environment. These activities are not currently regulated under the federal system. The Canadian Institutes of Health Research have guidelines for working with genetically modified organisms. Most research institutions — both public and private — also have their own codes of conduct and oversight committees for biotechnology research."

Right. We need regulations that clearly enforce the requirement that GM material does not escape. Further, we need to address liabilities and put in place the financial infrastructure to deal with any breaches that impact human welfare.

Re: "This means that plants produced through biotechnology are grown under conditions aimed at preventing the transfer of pollen to other plants; they are monitored by the experimenter and CFIA field inspection staff; and the trial site is subject to post-harvest, land-use restrictions and further monitoring."

Have these methods not failed to achieve their purpose in the past? Why does your report not specifically address these instances and give an analysis of the dangers posed? Current techniques of sequestering GM plants and the legislation governing this are both wholly inadequate in my opinion.

Re: "The question of commercial secrecy also arises in the debate over the government’s transparency. The desire of companies to maintain the confidentiality of data or information that they consider “commercially sensitive” has some impact on the degree of detail the regulator can provide in communicating information to the public. It also raises the question of who should determine what is commercially sensitive information".

If commercial interests make a pact with the Devil to arrive at their FrankenFood, we need to know. Absolute disclosure should be mandatory with proper peer review and the opportunity for interested parties to comment intelligently. Informed consent is an absolute must. How can we give informed consent to something about which we know only partial details?

The Network Endgame

I believe that Facebook remains a significant threat to not only Google, but most of the other players in cyberspace as well. In fact, right now, Facebook is *the* significant threat. If I were in charge of Google I would be working furiously to contain this threat while there is still some chance. We may be past that point already. In the absence of any real competition and barring any serious misstep on their part, Facebook will swallow its competition on all sides.

There will be one dominant player in cyberspace and it will be a social network. Right now, Facebook is the one to beat. It reached critical mass somewhere around the 500 million MAU mark. This is an inevitable outcome of the mathematics of 'Group Forming Networks'. The compounded force of the interlocking networks makes Facebook extraordinarily sticky. In fact, mathematically, the acceleration toward the center of such a network exceeds that of gravity.

For Facebook, the battle has been all but won and it is now in the endgame.

Amongst other things, Facebook has:

- relationships with billions of people
- data on most of the active users in cyberspace
- a captive audience
- much more of the world's end-user data (photos, etc) than anyone else
- an extremely 'sticky' and interconnected network
- active support for 'group forming networks' (GFNs)
- the largest (by far) network of GFNs
- committed users
- very deep data on their users

The above assets are near impossible for another company to duplicate and, as long as they are not disrupted somehow, Facebook will continue to accrete 'social mass' and increase its network 'density'. The ease of disruption falls roughly as 1/(2^n), where n is the number of people in the network. At over 2 billion people, that value has become very, very small. It is so small that a network attempting to compete 'apples to apples' has virtually no chance of succeeding. To disrupt this network now will require seriously novel thinking.

As a particular example, the way to monetize page-views (and hence search, blogging, tweets, online content, etc) is through advertising. Google must cast a wide advertising net to facilitate a sale. Advertising on Facebook can be targeted much more effectively. Ultimately, the source of funding for advertising comes from sales. As long as a dollar in advertising results in more than a dollar in net revenue for the advertiser, the dollars will flow. Google cannot guarantee that your advertising dollars will create net profits. Because Facebook knows much better who will buy, when and why, they can already offer a much greater bang for the buck. Within a few years (probably sooner), they will likely be able to demonstrate that their advertising services increase net revenue -- the net profit associated with the advertising will exceed the cost of advertising, making advertising on Facebook a necessity.

Google still has search, so they will still have eyeballs. However, as more of cyberspace falls into Facebook, Google can provide search for less and less of the internet. Search is only going to draw people who are looking for something. If they already know where it is (on Facebook), they will not visit Google. This is especially true as more relevant content moves into Facebook and effectively out of the reach of Google.

Google has a lot of data on who searches for what, when and likely why. They also have a lot of data on volumes of search requests. No doubt they have some methods of determining how much of their audience is being captured and held by Facebook. The fact that Google, the company with the most intelligence on cyberspace prior to Facebook is worried means they must have seen something that worries them.

Any company that is threatened by Facebook (and I think that is most of the players in cyberspace) needs to first of all understand *why* Facebook presents such an extraordinary threat. It is not clear to me that even the people at Facebook realize why they have grown so spectacularly.

The best hope that competitors have is that Facebook stumbles badly and creates pressure for their user base to move. Failing either a dramatic challenge from a huge player like Google, Amazon, Microsoft, Oracle, HP, IBM, etc, government interference or a spectacular stumble by Facebook, the game has already gone to Facebook.

Excepting government interference, I think that Google, Microsoft and Apple have the best chance to disrupt Facebook. However, their window is rapidly closing. People and entities like companies, charities, clubs, etc. that already have data, a comfortable presence and network linkages on Facebook have no incentive to move away from Facebook. They need to be enticed away by something more compelling. As their presence on Facebook grows and is increasingly bound by linkages into the Facebook ecosystem, the ability to move them diminishes rapidly.

Facebook vs Google

Facebook presents unique challenges to Google. In my opinion the game has gone to Facebook. The only thing that can stop it is some combination of:

1) Government interference
2) Aggressive action by big players like Google, Apple, Microsoft, Amazon, etc.
3) Big missteps by Facebook management
4) Dramatic sabotage
5) An aggressive, disruptive competitor

Many appear to underestimate the threat presented by Facebook.

People are motivated to 'act' when they use a search engine looking to buy something. They will be more receptive to specific relevant advertising. However, Facebook is increasingly in a better position to know who will buy and when. Search engines have a query string and perhaps a little history. As they refine the system, Facebook will have much more. They will only put ads where they will be most effective. An advertisement on Facebook will ultimately anticipate the search before it even happens. You will eventually not be able to present your ad because no search will take place. Both the search query and the data it is looking for will be on Facebook and you will be out of the loop.

In the Facebook universe, I do not have to turn to a search engine to find out what folks around me are doing. It is scrolling by when I log in. As the Facebook relationship grows, they will anticipate more and more of what I might be interested in seeing and they will present it to me before I ask for it. Google is an excellent search engine, but it takes at least a dozen keystrokes or more to do any non-trivial search. It also requires me to shift attention away from what I am doing. On small-screen devices, it means I have to essentially switch applications to move off to do a search. As Facebook matures, it has the following advantages:

1) It has the user's attention already. If you are not Facebook, you have to get the user's attention first.

2) It has rich context beyond just a query string. It knows the user's habits and recent past, it knows this for friends, family, cohorts, demographic groups, etc. It gets real-time updates to this, so it 'knows what is going on'. [Network forces are cumulative. Here is one particular advantage this deep context confers on search: it can 'auto-up-vote and down-vote' pages, so the user is more likely to see the results they want. If everyone in your social circle is looking for that dance video, Facebook can anticipate the search request and place the link on the page before it is even requested.] Given time, Facebook will know where you are, who you are with and how long you are staying. A search engine can only use geocoding to guess where you are and lean on 'cheats' like persistent cookies to scrape together the rest.

3) It can access the full universe of data inside and outside of Facebook. Search engines can only access data visible outside. If I am looking for that picture of my friend's cat it is not even in the search engine index.

4) It already has relationships with buyers, sellers, intermediaries and other interested parties. Its relationships are persistent and interconnected providing much more intelligence.

5) It knows the user and has enough information to determine 'worthiness' (credit, health, credentials, social status, etc.).

6) It can persist unlimited data specific to users, buyers, sellers and their relationships.

7) Killer Facebook feature: mediated promiscuous data. There are ways to provide everything a vendor needs to offer a good quote without compromising user privacy in any way. Users can store private information without worrying that they will lose control of it.

8) Function is embedded in a 'live' application. A search in Facebook is a request to a framework, not a search engine.

9) The user has a 'vested relationship' with Facebook. Gaining access to private personal and financial information, contact information, etc is easier for Facebook because they already have a relationship capable of trust and they have ongoing storage related to the user.

10) Facebook already has experience and expertise with infrastructure at scale. This can allow very rapid expansion of capabilities that can entirely take competitors by surprise. They can consolidate before competitors can respond.  
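The 'auto-up-vote' idea in point 2 above can be made concrete. Here is a toy sketch (the function, weights and data are all hypothetical illustrations, not anything Facebook has published): a re-ranker blends a page's conventional relevance score with engagement signals from the searcher's own social circle.

```python
# Toy illustration of socially-boosted search ranking.
# results: (url, base_score) pairs from a conventional relevance ranker.
# friend_clicks: how many people in the user's circle engaged with each page.
def social_rerank(results, friend_clicks, boost=0.3):
    """Re-rank results, up-voting pages the user's social circle
    has already engaged with."""
    def blended(item):
        url, base = item
        # Square root gives diminishing returns: ten friend clicks
        # should not count ten times as much as one.
        social = friend_clicks.get(url, 0) ** 0.5
        return base + boost * social
    return sorted(results, key=blended, reverse=True)

results = [("dance-video", 0.70), ("news-story", 0.80)]
clicks = {"dance-video": 9}   # everyone in the circle watched it
ranked = social_rerank(results, clicks)
# dance-video now ranks first despite its lower base relevance score
```

A pure search engine has only the base scores; the social term is exactly the context a query string cannot carry.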

There is no doubt in my mind that Facebook is investigating what they need to do to compete in all the major categories. If they are not, they soon will be.

Right now, Facebook has a nearly insuperable advantage. They have the user's attention. Right now, it is possible to directly monetize that attention through advertising and they are doing that. However, the really big score is the deepening commitment of Facebook users. As these users continue to accumulate data on Facebook, their relationship with Facebook deepens in ways competitors simply cannot match.

Network effects have taken hold at Facebook and since that is the current center of gravity the positive feedback loop is irresistible. In fact, I believe that the only thing preventing gravitational collapse right now is Facebook itself.

I think a search company has a good chance of unseating Facebook. However, it needs to focus on the correct goal. Once Facebook institutes a usable search on their site it will be extremely difficult to compete with them in the search space.

Here are a few 'quick hits' -- ideas that I would have in mind if I were doing a search engine competitor to Facebook:

1) Providing a high-quality search is key. Bean counters will move in and suggest how you might 'improve' the business. Resist them. For now, the business is Search and you know more than they do.

2) Users are your partners; keep faith with them. Going forward, look for ways to share your success with your users. This may seem strange, but users create equity and they can ultimately snatch it away. Keep them from doing so by sharing with them fairly. One idea that might have merit is to let them participate in some preferential way in the actual market equity of the company; perhaps users could accumulate rights akin to warrants to buy shares. When I say 'fairly', I also mean that you should, for instance, protect their privacy.

3) Lean really hard on getting into users' browsers as the default search engine. This is an instant win; if you are at least as good as the other search engines, you almost win the battle (not the war) right there.

4) Realize that social networking is the ultimate killer app. Search is your entrée, but providing users with a permanent home should be your goal. Users will go back to the place where they find their friends and family, their notes, pictures, messaging system, etc. The dominant social networking player will have all the time in the world to perfect search, applications like mail, online sales, news feeds, etc. If you compete in a 'niche' like search, you must be the best and stay that way. If you are the dominant social network, you just have to 'be'.

5) Your true competition, Facebook, is still vulnerable. You can beat them, but not for long: Facebook could render itself invulnerable in less than two years. While that window is still open, a killer search company is the best bet for replacing them. Just sticking to your search niche is not an option, though; eventually the dominant social network will take it over. You need to gain traction with your niche, but make no mistake, they will close it.

It may be that the game is already over and Facebook has won. If that is the case, you should approach them as soon as you can to sell them the company or at least license the technology. Were you to join forces with Facebook today it would probably be a 'Google killer' and Facebook might well be willing to share significant equity to dispose of a potentially powerful rival like Google.

There are many nuances in all of this. For instance, keeping faith with users may be one way to steer clear of government interference.

Facebook Top Ten

Top ten things Facebook should do

I expect they are in the midst of this stuff anyway. The overall strategy is to ethically join forces with users, suppliers and advertisers against the competition: do a 'grand slam' sweep of all the low-hanging fruit by leveraging the existing user relationship, and justify it by honoring the security and privacy of users.

1) Keep the pressure on to retain users. As long as they have the revenue necessary to survive, they should not be injuring the quality of their system to chase dollars. Treat click-bait as spam and get rid of it.
2) Add a very high quality search engine. They have the ability to marshal server resources to crawl the entire web in short order. Just removing some existing annoyances would go a long way. Make it possible for a website owner to mirror their site to Facebook's servers.
3) Create a truly killer advertising system by working *with* users to retain privacy and promote only things they are truly likely to want to see. Example: coupons! There are so many great ways to make this type of thing work. Make advertisers compete in a 'top ten' offers race where half of the advertisers who make it into the top ten are not charged for the advertising.
4) Set up real-time Q&A that can tie into people's mobile devices. Work to link with Siri, Cortana, etc.
5) Provide premium streaming content -- music and video. Apple Music, Netflix and Spotify can be beaten.
6) Make a dead-simple online IDE that makes programming against a Facebook 'App' API easy.
7) Move users to the cloud. Create an arm's-length, joint-custody secure information and messaging system that replaces e-mail and messaging with a hybrid that includes trustworthy storage.
8) Issue a Facebook charge card to any established Facebook user who asks.
9) Bundle the premium stuff for a no-brainer $5.99 per month fee. Existing incumbents are not setting the bar very high when it comes to respecting their users. Facebook is in a position to blow away the competition by *cooperating* honestly with its users.
10) Issue a streamlined Facebook browser based on WebKit to tightly integrate all of the above, and make it a no-brainer open-source system that anybody can download, customize and build: you go to a link, download an install file, and it sets up the entire build system with source code and an IDE that lets you simply click a 'build' button.

Browser Check

(Interactive widget: displays the visitor's IP address, browser name and version, operating system, etc.)