Sunday, December 15, 2019

What means convergence?

[This is from a posting I made originally on Advogato Aug 18, 2003.]

Have a picture of your mother in the shower? Want one?
What means convergence?
There has been a lot of talk about 'convergence', but I think that it has been short of the mark for the most part. Convergence is something that has driven my company for more than five years now. The world has been slow to converge, but the infrastructure is slowly gaining momentum. Convergence as I describe it here will happen and it has important consequences for us all.
First, let me explain what convergence means to me.
Background
Once upon a time, information flowed from word of mouth down through the generations. With the discovery of writing, this information could be kept in a more accurate form. With the invention of the printing press, this information could be widely disseminated to a large audience. Over the years, technology has allowed information to flow ever more quickly to ever wider audiences through the advent of newspapers, magazines, telephones, radio, television and finally networks, the Internet and the web. While these technologies matured, business models emerged to pay for them. Control of these various media outlets was gathered into the hands of a few entities and was sanctioned separately by governments.
Historically, all of the various different types of media have been considered separately and controlled separately. Visual communication via printing on paper and electromagnetic telecommunications are largely incommensurate. As computing devices and telecommunications matured, it became technologically possible to merge most communications into one network. Vested interests in phone companies, cable companies, television and radio have fought tooth and claw to keep their government sanctioned (usually via licenses) control over their piece of the pie. Telecomm and Cable companies in particular have tried very hard to leverage themselves into becoming the pipe through which everything arrives. With the wide acceptance of the open standard TCP/IP and the development of the World Wide Web and its many open standard protocols, it is rapidly becoming physically and politically possible to merge (converge) some of these formerly independent communications media. This is being done. Voice over IP -- using the Internet to transmit phone calls -- is an example of this convergence.
I think that the majority of players in the know in phone, cable, radio and television realize that convergence will become a reality. The electromagnetic spectrum currently doled out in licenses to particular companies for radio, television and telecomm will eventually be 're-purposed' to the carrying of TCP/IP traffic. At least a variant of Ethernet/TCP/IP will be the lingua franca of communications. Tag language interchange of data at the application level will rule the day.
One thing that gets lost in the noise that is particularly important to us all both socially and politically is the convergence of electric power and the network. Two things prompted this article. One was the power outage experienced by the northeastern U.S. and Canada. The other was a news story about a camera Sony will be releasing in October of this year. The camera will use the IEEE 802.3af Power-over-Ethernet standard to draw its power from the network.
What means convergence?
I can't say precisely when it will become dominant, but the following convergence is already happening: all communications and part of the power grid are becoming a single transparent worldwide network. Communications devices and their power requirements are shrinking. It is technically possible, for instance, to produce a camera with a 360x360 degree viewing range that is connected to the Internet and nearly invisible to the eye. When mass-produced, such devices would cost pennies or less apiece. Privacy as we know it will become almost impossible within my lifetime. Webcams? They will be everywhere. Light bulbs will be monitored via the Internet because it will be cheaper to produce them with the device than without. Take a look at the text on an Orville Redenbacher's popcorn bag sometime; if you understand what they are saying, you will agree with me.
Convergence and the Free Software Community
What has all this to do with free software folks? Plenty. First, as members of the community at large, we all have a stake in this. Second, as the only cohesive group that can understand this stuff as a group, we have some duty to join the fray to ensure that privacy and freedom are balanced appropriately. Third, we have a particular vested interest in ensuring that the commons -- both the highway and the traffic -- are not misappropriated.
A Byzantine tangle of laws and corporate ownership already clouds the issues. As I write this, lobbyists everywhere are working very hard to add to that Gordian Knot binding what I believe properly belongs to the commons. The passion and intelligence of the free software community should be (actually is if you look at GNU and EFF closely) brought to bear to cut through the nonsense and vested interests to ensure that serving the public good becomes as inevitable as convergence itself.
How this affects one company
We (my company and some of its partners) have accepted the reality of convergence as described here for more than five years. We have acted upon some of the primary ramifications by working on components that we feel will be needed in the converged world.
One of our company's directions has been to do primary research into what we call 'data packaging'. This involves the creation of tools and the validation of assumptions as to the movement of data. We expect that network data will require the following:
• Security -- can be viewed only by the intended recipient
• Authenticity -- can be authenticated as valid from a given source
• Integrity -- can be verified as unaltered
• Reliability -- can endure alteration of bits or bytes without compromising the above aspects
• Privacy -- can be authenticated, but not necessarily traced back to an individual or location
• Parsimony -- packets should be as small as possible while achieving the above goals
Protocols and standards for the above have matured during the years we have been involved in this research. However specific tools, infrastructure and wide adoption are still lagging. Our real focus has been on parsimony. We have been trying to come up with greatly improved compression since this has immediate value. We still feel this is a very promising area of research, though results to date have been disappointing.
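As a hedged illustration (not our actual tooling), some of those requirements can be sketched with Python's standard library alone: an HMAC tag gives authenticity and integrity, and zlib gives a crude form of parsimony. True security (confidentiality) would require an encryption library and is omitted; the key below is a hypothetical placeholder.

```python
import hashlib
import hmac
import zlib

SECRET = b"shared-secret-key"  # hypothetical pre-shared key


def package(payload: bytes) -> bytes:
    """Compress the payload (parsimony) and prepend an HMAC tag
    (authenticity + integrity)."""
    body = zlib.compress(payload)
    tag = hmac.new(SECRET, body, hashlib.sha256).digest()
    return tag + body


def unpackage(packet: bytes) -> bytes:
    """Verify the tag, then decompress. Raises ValueError if the
    packet was altered in transit."""
    tag, body = packet[:32], packet[32:]
    expected = hmac.new(SECRET, body, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("integrity/authenticity check failed")
    return zlib.decompress(body)


packet = package(b"hello " * 100)
assert unpackage(packet) == b"hello " * 100
assert len(packet) < len(b"hello " * 100)  # compression paid off on this repetitive payload
```

A real data-packaging scheme would layer encryption and error-correcting codes on top of this; the sketch only shows how the listed requirements compose.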
Another one of our directions has been the development of a web business. This comes as a result of one of the subtle ramifications of convergence: the top priority of any business should be to establish and maintain a relationship with individuals. As convergence takes hold, only the relationships between individuals will ultimately matter. Everything becomes de-localized. Only human allegiance will bind clients and vendors together. Trust, in its many different meanings, will become paramount. We have worked quietly to build trust within a small group of individuals and we have expanded that circle of trust slowly. As our clients come to realize that they can move to any vendor they want instantly, we hope they will stay with us because they trust us.
The web business also recognizes that the web is basically the on-ramp to the converged information highway. If we can be a trusted contact at the entry point, we get first dibs on any and all services that we can provide. These services will expand to include just about everything of value eventually.
I am not at liberty to say what, but my company and its partners are looking very closely at particular niches to provide services involving security and trust in the converged world.
How does this affect you?
As an individual, you will likely be living in the converged world. Get ready. Let your conscience be your guide. Anything you formerly did in private is about to become somewhat public. Think about that. You had better either get truly comfortable with the things you do differently or start conforming to something you can defend.
Already, the Internet has a memory. It's no use trying to deny your past on the net. Somewhere there will likely be a copy of what you said or did. This can get real uncomfortable for the weasels. I like that. It is my hope that eventually we all get together to create a list of individuals hostile to the commons and deny them the services of the commons. I particularly like Advogato's trust metric research with respect to this. The weasels can say whatever they want, but who will listen? Eventually they will be known for what they are.
The current SCO assault on our community comes to mind. I think SCO will die an ugly death as a result of this. However, who will remember precisely what individuals were involved so that we can ensure they don't try again elsewhere? Eventually this type of information will be too ubiquitous to hide and the weasels will rarely get a chance to do any damage. Dilbert is based on reality. These guys are almost always multiple offenders. It would be a treat to have their record accumulate publicly.
Speaking of public life -- get used to globalization. It is a big follow-on from convergence. I expect that the global economy will eventually transition to a single currency. Already money and political power are linked and political boundaries are fading. Countries once had near absolute sovereignty over their borders. International agreements have changed that. Expect this process to accelerate. Geeks such as myself have already felt the pinch as our jobs quietly move overseas.
Finally, getting back to the network memory -- I really like the idea that records will accumulate. However, it comes with consequences. None of us is without sin. It is most important that people in our community (geeks who understand this stuff) work diligently to ensure that our transition to the converged, border-less, information rich world is sane and humane.

--------

Additional text added in reply to comments:

I apologize that the article was so broad and general. Time, you know.
When I said "Let your conscience be your guide" I was referring to the fact that the new converged world will strip you of privacy. You should be sure that whatever you do is something you personally feel comfortable defending if it becomes public. That is because physical privacy of your image and words will become difficult or impossible. Chances of behavior becoming public will likely be inversely proportional to its novelty, proportional to the status of the individual, inversely proportional to the social distance between yourself and an observer, etc, etc. The tag line about your mother and the shower is an old joke that is about to become literal reality. I have seen hardly a mention anywhere of this obvious, inevitable outcome of coming technology. I don't have anything to hide, but even I am squeamish about being watched in the bathroom ... it's creepy.
Re: The public good -- what is it? What maximizes it? Somebody will be answering these questions and if theirs is the only voice heard, that's what we'll get. Frankly, the extension of copyrights, the DMCA, the notion of 'Intellectual Property' (as if you could make such a bag and stuff patents, copyright, etc. into it), FrankenFoods, etc. all act AGAINST the public good in my opinion. However, my opinion was never really heard. The above odious things have been legislated as being in the public good thus far. In the converged world, where it is feasible for the government to track your every breath, they likely will unless we insist on something else.
Re: Convergence not as wide ranging as suggested. I emphatically disagree. The networks will inevitably converge and they will be attached to just about everything. Guns and Bullets? The really dangerous stuff is probably already part of a network somehow and that trend will accelerate. How do guns and bullets get deployed and used? Somehow messages went out over a network and those messages resulted in the public will to deploy guns and the particular private orders that resulted in deploying and using them.
Re: You still can't transmit food and water over the internet. Ha ha ha. The internet does not carry the goods, but it sure as hell results in their delivery. Go to Amazon and order a book over the internet. The book WILL arrive at your door. I can personally send an email that would result in a bottle of water being delivered in Afghanistan of all places. Guns, bullets, food and water all go where money tells them to go and as far as I know, most money moves via networks already.
Re: Power supply is practical for IP. Yes. And the new Ethernet standard is a practical vehicle for electrical power sufficient to run devices. Even if a device requires tremendous amounts of current, you can still send IP packets over the line delivering that current; since a device with a TCP/IP stack has a vanishingly small cost, why not monitor and control the device via the same line?
Consider:
  • Device power and device control on separate lines:
    • Power goes down -- no device to control
    • Control goes down -- device out of control
  • Device power and control on same line:
    • Device is either off or on and under control.
Which of the above makes the most sense?
What has not yet happened, but I predict will happen is that TCP/IP and a TAG language will be used to monitor and control just about everything and that just about everything capable of delivering or receiving information will be network attached. Why? Because ultimately it will be the cheapest way to build things. Replace every control and monitoring device on everything with a TCP/IP device that costs pennies or less.
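To make the 'TAG language' claim concrete, here is a purely hypothetical sketch (the tag names and values are invented for illustration) of a network-attached light bulb reporting its state, parsed with Python's standard library:

```python
import xml.etree.ElementTree as ET

# Hypothetical status message from a network-attached bulb.
message = """
<device id="bulb-0042" type="light">
  <state>on</state>
  <watts>9.5</watts>
  <hours-in-service>1204</hours-in-service>
</device>
"""

root = ET.fromstring(message)
assert root.get("id") == "bulb-0042"
assert root.findtext("state") == "on"
print(f"{root.get('id')}: {root.findtext('state')}, "
      f"{root.findtext('watts')} W")  # → bulb-0042: on, 9.5 W
```

Any monitoring system that can speak TCP/IP and parse tags can then watch or command the device; no vendor-specific protocol is needed.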
The world becomes very strange once wireless network-attached cameras cost less than a dollar and everything they see can be viewed and controlled remotely from a browser.

Wednesday, October 30, 2019

MSYS2 broken install

Ugh. Every year, the non-workingness of released software gets worse.

Problem:

MSYS2 does not quite install properly.

Error Message:

error: mingw32: signature from "Alexey Pavlov (Alexpux) " is unknown trust

User Workaround Fix: 

Run the following commands:

pacman-key --init

pacman-key --populate

pacman -Syu

At the end of the various messages and prompts (choose default Y) you get:
warning: terminate MSYS2 without returning to shell and check for updates again
warning: for example close your terminal window instead of calling exit
When you attempt to close you will get a new warning in a popup window about running processes. Close anyway.

Open a new MSYS shell window and rerun pacman:

pacman -Syu

Wait until your brain explodes or you complete the re-installation/upgrade of packages (in my case 40).

That was it for me. YMMV.

Developer Fix: 

Release is broken:

  • Fix Deployment Test to capture error before users see it.
  • Fix Deployment, since it deploys incorrectly.

Test is broken:

  • Fix upstream testing

Build is broken:

  • Fix Build and upstream code for deployment.


      Saturday, September 21, 2019

      Privacy: We've heard of it.

      I am a big fan of irony. I find it delicious that Firefox recommends that the way to 'protect your privacy' is to 'Join Firefox', thereby immediately sharing your email address with the world, and to agree to a 'privacy policy' which promises to collect essentially every bit of information it possibly can and share it with essentially every corner of the Internet. Step one? Immediately pass your email address on to a 'partner' that will share it with anyone they wish.
      By default, Firefox collects and shares any and all data available from anywhere it can, including "data that identifies you or is otherwise sensitive to you" from an unlimited number of third and fourth parties.
      Data is sent to third parties, each of which has its own privacy policy. Each in turn confesses to acquiring anything they can, storing it, and further sharing it on to fourth parties. Third parties are not limited to, but explicitly include:
      Pocket Recommendations, Adzerk, Google, Adjust, Leanplum, 'search partners', your search provider, SalesForce, Firefox connected services (Lockwise, Monitor, Notes, Send, Firefox Screenshots and Sync), and Mozilla.




      Monday, February 4, 2019

      Standard Directory System

      STD -- Standard Directory

      Overview

      The 'Standard Directory System' has been in use in various forms for more than 30 years. The principal aim of the system was to avoid collisions. That is, it was intended to allow files that are different from one another to never overwrite one another. This was accomplished by having conventions for the names used by various common programs. As the system has developed, it has also come to organize things so that they are easier to find. This is accomplished by having conventions as above for particular programs, general conventions for particular types of programs, and conventions for standardized directory structures. The standard anticipates various types of common requirements and assigns standard names to those requirements.

      Standard Directory Names

      Where possible, conventions that already exist have been followed. In some instances, such as the 'usr' directory under UNIX, it has been decided to deviate from the convention so that collisions are avoided. In other instances, such as the 'home' directory, the convention has been followed to allow easy transition back and forth between the systems.

      General Conventions

      Three letter name sizes

      As a rule, directory names are only three letters long. There are several reasons for this:
      • People generally find 'TLAs' easier to remember.
      • By making most 'standard' names the same size, other directories are easy to spot in a listing.
      • Organizing large disks can require very deep structures. Using long names often leads to over-runs in the size of the name. This is particularly true for various target media. For instance, some media limit the total path length to only 63 characters. ISO 9660 CD-ROMs have a maximum path length of 207 characters.
      • By making the standard names only three characters long, we free up more space for other software that does not follow the standards.
      For instance, here is a real pathname as constructed by other software and its equivalent as a 'standard directory name':
      1. C:\Program Files\Microsoft Small Business\Small Business Accounting Addins\Fixed Asset Manager\Templates
      2. C:\std\app\mcs\msb\sba\fam\tpl
      Example 1 is 104 characters long. Example 2 is only 30. Should there arise an occasion (as there does in the real world) where one would want to copy a backup with the full directory structure into the bottom of the path, the first would exceed the length allowable on most CD-ROMs and it would not be possible to copy it there. Worse, many file systems will allow you to write a path that exceeds their limits and then not allow you to read or delete that path. Experience with real world systems where entire structures must be written deep into other directory structures shows that problems arise quickly if very long names are used. Since we can't control the use of long names in other systems, we are doubly obliged to keep our own usage to a minimum.
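The arithmetic is easy to check; a quick sketch using the two example paths:

```python
# Path length comparison for the two examples above.
long_path = (r"C:\Program Files\Microsoft Small Business"
             r"\Small Business Accounting Addins"
             r"\Fixed Asset Manager\Templates")
short_path = r"C:\std\app\mcs\msb\sba\fam\tpl"

ISO9660_MAX = 207  # maximum path length on an ISO 9660 CD-ROM

print(len(long_path), len(short_path))  # → 104 30

# Nesting one structure inside another: the long form bursts the
# ISO 9660 limit after a single level of nesting; the short form
# leaves plenty of headroom.
assert len(long_path) + 1 + len(long_path) > ISO9660_MAX
assert len(short_path) + 1 + len(short_path) < ISO9660_MAX
```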

      Simple Names

      The standard specifies that names should be, as much as possible, formed according to simple 'lowest common denominator' rules. This is in keeping with the principle that we are attempting to avoid 'collisions'. In this case, the collisions are with the conventions of a given operating system. That means:
      • No spaces; use underscores if needed.
      • No characters that conflict with operating system shells -- example: '(', ')', '[', etc.
      • Generally use lower case letters to remain compatible with case-sensitive systems such as HTTP.
      • If practical, there is still a bias toward 8.3 DOS style file names, especially with files that may travel to many operating environments -- example: readme.txt
      As a general rule, the name should be able to retain its exact structure across all environments where it is likely to reside.
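A hedged sketch of those rules as code (the function name and the exact allowed character set are my own illustration, not part of the standard):

```python
import re


def simplify_name(name: str) -> str:
    """Illustrative 'lowest common denominator' cleanup:
    lower-case, spaces to underscores, strip shell-hostile characters."""
    name = name.lower().replace(" ", "_")
    # Keep only letters, digits, underscore, dash and dot.
    return re.sub(r"[^a-z0-9_.-]", "", name)


print(simplify_name("Fixed Asset Manager"))  # → fixed_asset_manager
print(simplify_name("Templates (v2)"))       # → templates_v2
```

The result survives unchanged on DOS, UNIX and the web, which is the whole point of the rules.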

      Some specific names

      Root directory

      The root directory currently in use is 'std'. This is not particularly important and for anyone using the system outside of ones created by us, any three-letter root that does not collide with conventional directories would be fine.

      Standard Sub-Directory Names

      This is not an exhaustive list and it changes from time to time as standards change.

      std -- Root Directory -- root directory as currently used. This can reside on any drive or, in the case of UNIX, would hang off the regular root. Example: C:\std\
      • app -- Applications -- application directories.
      • arc -- Archives -- archive files. Not quite the same as the backup directory; this directory merely contains archives of files that are taking up space but still may be needed from time to time. It is also a 'scratch area' to allow archiving temporarily to free up disk space.
      • bin -- Executable Files -- binary executables and scripts.
      • bkp -- Backup Files -- backup files and related files. These are intended for actual backups. What type of backup resides here depends upon the location of the directory.
      • cyg -- Cygwin -- under Windows environments, reserved for Cygwin.
      • dvl -- Development -- software development directories.
      • doc -- Documents -- *.doc, *.txt, *.xls, *.ppt, etc.
      • home -- User Home Directories -- retains the UNIX convention for the home directory. Now deprecated in favor of 'hme'.
      • hme -- Home Directories -- by making user homes different from both Windows logins and UNIX logins, the system becomes more 'OS agnostic'. It also does not suffer from the clutter put in home directories by well-meaning admins. Copying /std/hme/nzt/doc from one system to another should not interfere with anyone else's conventions. There is also the matter of symmetry. Branches should generally be three letters. That seems to work best in practice and it makes directory trees easier to visualize, less tedious to type, etc. Leaves can (and arguably should) take longer names; thus /std/hme/rst/dvl/vb6/tst/TestTrackBall would be perfectly valid. That path breaks down as:
        • std -- Standard root
        • hme -- Home directories
        • rst -- Robert Stephen Trower's home directory
        • dvl -- development
        • vb6 -- vb6 specific development
        • tst -- test programs
        • TestTrackBall -- working directory to test TrackBall code
      • inc -- Include Directories -- contains various types of include file. This is generally different from the C-language standard 'include' directory.
      • lib -- Library Directories -- libraries. This typically contains *.lib or *.o or *.obj files.
      • std -- std 'reflection' directory -- under C:\std, for instance, this would be C:\std\std. It is used as a 'reflection' of a standard structure that might, for instance, be the subject of a network share.
      • svn -- Subversion -- Subversion revision system.
      • tmp -- Temp Files -- temporary files. This is especially used by things such as session variables.
      • trn -- Transfer Files -- transfer directory. When files are being transferred into or out of the system, this is where they reside during transfer.
      • web -- WWW Files -- root of things related to a local web server.
      • wrk -- Work/Scratch Files -- many things done on a file system are experimental, have a limited lifetime or require some thought to create a permanent home. The 'wrk' directory gives a quick area to do work that will not interfere with the rest of the system.

      Standard Path Variable

      The 'standard path' is formed so that the commands available depend upon your context within the file system. For instance, there may be utilities specific to one set of files that do not apply to another. This ensures that when appropriate, the commands are on your path, but does not require your path to include everything on the disk.
      The standard path usually contains something similar to the following:

      Standard Paths

      Absolute Paths

      • Personal command directory (over-rides all): C:\std\home\myname\bin;
      • Global command directory: C:\std\bin;
      • Root command directory: \bin;

      Relative Paths

      • Local command directory: bin;
      • Parent command directory: ..\bin;
      • Grand-parent command directory: ..\..\bin;
      • Great-grand-parent command directory: ..\..\..\bin;
      The above forms a path similar to this: path=C:\std\home\myname\bin;C:\std\bin;\bin;bin;..\bin;..\..\bin;..\..\..\bin;

      Operating System Paths

      Other paths are required as well, such as:
      • Adjunct OS path (Cygwin): C:\std\cyg\bin;
      • The operating system path: C:\WINDOWS;C:\WINDOWS\SYSTEM32;
      It may also be convenient for certain applications to be usable globally from the command line:
      • Application path: C:\std\svn\bin;

      Putting things together in the intended order gives a path that looks like this: path=C:\std\home\myname\bin;C:\std\bin;\bin;bin;..\bin;..\..\bin;..\..\..\bin;C:\std\cyg\bin;C:\WINDOWS;C:\WINDOWS\SYSTEM32;C:\std\svn\bin;

      As can be seen above, this is already a substantial path. However, it allows a very wide range of commands to be available while working and leaves room for the many applications that add their own paths during installation. Because many of the commands are on relative paths, the search for a command is much faster than it would be if all application commands were on the path at once. The bin;..\bin;..\..\bin; relative path construct also allows the placement of commands to show a finer granularity and greater specificity to the task at hand.
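The assembly order can be sketched programmatically; directory names are taken from the text, with 'myname' as a placeholder as above (and the Windows system path spelled out in full):

```python
# Assemble the standard path from its pieces, in the intended order:
# personal and global absolute paths, then relative paths, then the
# adjunct OS, operating system and application paths.
absolute = [r"C:\std\home\myname\bin", r"C:\std\bin", r"\bin"]
relative = ["bin", r"..\bin", r"..\..\bin", r"..\..\..\bin"]
adjunct  = [r"C:\std\cyg\bin"]
system   = [r"C:\WINDOWS", r"C:\WINDOWS\SYSTEM32"]
apps     = [r"C:\std\svn\bin"]

path = ";".join(absolute + relative + adjunct + system + apps) + ";"
print("path=" + path)
```

Keeping the pieces as separate lists makes the precedence explicit: anything in the personal bin shadows the global bin, which shadows everything after it.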

      By placing everything that pertains to a given system within its own directory structure, backup, removal and restoration of backups is simple and has a minimal impact on the rest of the system.

      Relative paths also allow the simple relocation of directories and also allow the creation of completely operational copies of the directory in other places. It is possible, therefore, to try a radical global change on a group of directories on a copy rather than the original.

      Gun control

      People with guns are definitely killing people. 

      The U.S. second amendment is clearly, should you look at its historical context, meant to guarantee citizens the right to bear arms precisely to resist an over-reaching state. Emphasis mine:

      "In a 5-4 decision, the [United States Supreme] Court, meticulously detailing the history and tradition of the Second Amendment at the time of the Constitutional Convention, proclaimed that the Second Amendment established an individual right for U.S. citizens to possess firearms and struck down the D.C. handgun ban as violative of that right. " --https://www.law.cornell.edu/wex/second_amendment

      The U.S. Constitution is the instrument whereby the Citizens of the United States delegate the partial exercise of *their* sovereign power. That instrument specifically forbids disarming the citizenry. The interpretation that 'militia' somehow puts that power right back in the hands of the state is without merit. 

      Because it is so clear in protecting individual rights, the U.S. constitution acts to some extent like a proxy for what is the sensible intent of a citizenry as a collective. 

      Biotechnology Malfeasance


      Below is a submission that I made to CBAC. This was a cynical process that was dressed up as a public consultation but was in fact a 'pro-forma' exercise to rubber-stamp a decision already made without public input. As it turned out, public consultation showed that the public was nearly universally opposed to the legislation contemplated and ultimately passed.

      Comments on the document: Interim Report of the Canadian Biotechnology Advisory Committee to the Biotechnology Ministerial Coordinating Committee

      The original source of the document reviewed was here: http://www.cbac-cccb.ca/documents/GMenglish.pdf. This has since been removed. However, I found what appears to be the same document archive here: http://publications.gc.ca/collections/Collection/C2-589-2001E.pdf


      About me:

      I make my living as a software developer and researcher. I have a Bachelor of Science degree (major in Biology). I have no particular vested interest here, other than as a member of the public. I hold a scientific world-view similar to 'the received view of science'. I am somewhat biased in favor of scientific advancement. I am hardly a 'tree-hugger', or an 'eco-nut'.

      My concern with respect to the matter at hand is grounded in my understanding of the following:

      a)       The opportunistic nature of biological organisms and evolution
      b)       The geometric growth of living populations
      c)       The insular nature of bureaucracies
      d)       How lobby groups influence the public agenda
      e)       Intellectual property issues
      f)        How answers depend on the particular questions asked

      Additionally, I did a little research to look at what others are saying about the various issues involved and looked at some news stories surrounding specific related events.

      General Comments:

      The CBAC document surveys many different aspects of the subject at hand. Some valid points are raised and an effort has been made to make the document at least seem inclusive. However, the document seems lacking in some important ways. I feel the following are flaws that should be addressed in a final draft:

      1)       The potential for catastrophe is not properly treated in this document, even though this is really the heart of the matter. This appears to stem from a misunderstanding of the distinction between biological material and non-biological material. Living things migrate, self-replicate, interbreed and evolve. Current technology can not cure the common cold or safely rid my lawn of weeds. What does this say about our preparedness to deal with a virulent biological threat?
      2)       Important recommendations and feedback regarding caution and safety concerns are buried in the report and appear to be 'soft-pedaled'.
      3)       Trivial concerns and considerations are presented as if they are equal to very important concerns. This is particularly true with respect to safety, sustainability and public welfare.
      4)       Liability issues are ignored entirely.
      5)       "Transparency" is dealt with in an abstract way but it is unclear that the process has been effectively transparent. Cursory research shows that the document in question is not a balanced reflection of the views of all stakeholders.
      6)       The problems inherent in current intellectual property legislation are not adequately dealt with. See the appalling decision by the federal court here: [Note original is no longer on the web, but archive.org has a copy here: https://web.archive.org/web/20011218034107/http://decisions.fct-cf.gc.ca/fct/2001/2001fct256.html] [original no longer available was: http://decisions.fct-cf.gc.ca/fct/2001/2001fct256.html.] I am certain that informed Canadians would object to this outcome. The judge's decision is soundly based on principles of law. However, the legislative framework and precedent surrounding this is bad.
      7)       The uncontrolled spread of biotech material, equipment and expertise might lead to deliberately created bio-weapons. The report does not address the issue at all.
      8)       Specific examples from past experience are missing. There is no mention, for instance, of StarLink. Why not?

      My general impression of the document is that it is heavily influenced by partisans whose aim is to allow GM, GE products to enter our biosphere for profit. This is troubling in the extreme. It is only at this point in time that we will have the opportunity to decide whether or not it makes sense to let the genie out of the bottle. Once it is out, we may not be able to put it back in. The CBAC document ought to be more balanced and forthright in presenting the issues and recommendations.

      Here is what I would like to see:

      1)       Get some real biologists involved. We are talking about living things here. They differ markedly from non-living things. It is clear from the content of the document that there is a profound lack of understanding of the importance of the mechanisms of both biological evolution and ecology. GM foods absolutely do not belong under the umbrella of 'novel foods'.
      2)       Ban FrankenFoods not already approved.
      3)       Mandate proper liability insurance coverage for FrankenFoods in use.
      4)       Mandate proper liability insurance coverage for research facilities.
      5)       Mandate a "Roll-back" plan. Any FrankenFood introduction should have a back-out strategy that completely undoes the introduction. This includes the costs of removing the mutant material from the ecosystem and repairing any damage done. If you can't afford the cost of the insurance, you can't afford to do it.
      6)       Place proper, conservative legislation around FrankenFoods in use.
      7)       Fund research to assess risks properly.
      8)       Legislate safety standards for research such that material used by and generated by research activities cannot escape into the environment. This should include controls on access to advanced equipment and materials that could be used to develop biological weaponry.
      9)       Fund an advocacy group to persuade other entities (nations, multinationals) to keep their GM, GE products from crossing our borders.
      10)   Legislate severe penalties for vested interest parties found to subvert the process of this public debate. This should include jail time, disgorgement and punitive damages. "Whistle blower" legislation would be good and perhaps bounties to ensure that there is some incentive in place to find and prosecute those who would injure the public good.

      Specific comments regarding document content:

      Re: 8.1 "The Panel recommends the precautionary regulatory assumption that, in general, new technologies should not be presumed safe unless there is a reliable scientific basis for considering them safe. The Panel rejects the use of “substantial equivalence” as a decision threshold to exempt new GM products from rigorous safety assessments on the basis of superficial similarities because such a regulatory procedure is not a precautionary assignment of the burden of proof".

      The quote above appears on page 63. It should be right up front, stated more plainly and should be the central thrust of the report. I believe that these technologies should be assumed unsafe and banned initially.

      Re: Section 5.4: "This does not imply, however, a zero-risk approach. … Under circumstances where it is appropriate to use substantial equivalence as a framework to structure the safety assessment of novel foods, it is necessary to ascertain whether the composition of the plant has been changed in any way. "

      When we are talking about potential catastrophic failure of our environment, our food supply and life itself, I would say that a 'zero-risk' strategy is the only acceptable standard. Is there a reputable biologist not co-opted by the bio-tech industry that accepts 'substantial equivalence' as a meaningful concept? I do not accept the 'substantial equivalence' concept at all.

      There is no discussion of how liability will be dealt with. Say a particular FrankenFood vector wipes out a vital food species, kills scores of people, is responsible for genetic infirmities in people or livestock, destroys eco-systems, etc, etc. Who pays? Should we not work to establish this? I would like to see a comprehensive liability insurance scheme in place. It should be funded by all FrankenFood beneficiaries. It should absolutely cover any and all liabilities arising from the introduction of FrankenFoods into our ecosystems.
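The insurance argument above reduces to simple actuarial arithmetic: a fair premium must at least cover probability-of-loss times size-of-loss, so as the potential loss grows toward the catastrophic, the premium grows with it. A minimal sketch with entirely hypothetical figures (the function name, loading factor and scenario numbers are my own illustration, not anything from the CBAC report):

```python
# Hedged illustration of expected-loss premium pricing.
# All names and figures here are hypothetical.

def fair_premium(p_loss: float, loss: float, loading: float = 1.5) -> float:
    """Expected loss times a loading factor for expenses and risk margin."""
    return p_loss * loss * loading

# An ordinary product recall: a 1-in-1000 chance of a $10M loss.
print(fair_premium(0.001, 10_000_000))

# A catastrophic ecological release: even a 1-in-a-million chance of a
# $1-trillion loss prices two orders of magnitude higher than the recall.
print(fair_premium(0.000001, 1_000_000_000_000))
```

The point of the sketch: if the beneficiaries of FrankenFoods cannot afford the premium implied by the worst case, the activity is uninsurable and arguably should not proceed.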

      Re: "The sessions were designed to achieve a balance of representation from the general public, society, industry, research and academia. However, some representatives from civil society, primarily environmental nongovernmental organizations, chose not to participate, thus diminishing the representation of this group. "

      This discussion, as framed by CBAC, has been similar in kind to a discussion about a regulatory framework that assumes the exploitation of children. You start with the assumption that something unthinkable will be done. We are now talking about the mechanics of legislation. It is fairly clear from your document that you do not have the benefit of the following point of view: GM, GE foods could be exceedingly dangerous on a scale that makes their introduction unthinkable at this point in time.

      It is troubling that we are blithely discussing a 'regulatory framework' for FrankenFoods. Due to the actions of large integrated companies in the food industry and regulatory failures of other political jurisdictions, we may be obliged to accept FrankenFoods. However, this should be specifically addressed as such ("this is why we are forced to…") and strategies to combat this should be examined.

      Re: "The petition presented by these representatives, as well as CBAC’s response, can be found on the CBAC Web site".

      I could not find it there. In fact, it took a while to track it down here:


      I was troubled by the fact that the response seemed tangential to the questions raised and carefully avoided the substance of the petition itself. Even though I had not read the original petition at that point, a careful reading of the response made clear that it was meant only to deflect, not to engage. Here is where I found the petition itself.


      It is chilling to read both the petition and the government response. It is equally chilling that such a substantive treatment of the issues CBAC should be addressing is dealt with in such an off-hand way in the CBAC document. It is curious that you say 'can be found on the CBAC Web site', but do not offer the location in your document.

      "CBAC will take into account the feedback from Phase 3 to produce its final report and formal recommendations on the regulation of GM foods."

      What does the above actually mean? My cursory inspection of available sources on the internet shows that, for the most part, observers without a vested interest in bio-technology disagree with the headlong rush to embrace GM foods. It would be difficult to get that message from a reading of the current report as proposed by CBAC. CBAC seems to be already in the 'pro bio-tech' camp. To the extent that arguments respecting caution are acknowledged, the posture of the document tends toward condescension and the 'handling' of objections.

      "This report will be delivered to government in early 2002".

      This report and any legislation pertaining to GM foods should definitely be debated in the House of Commons. Hopefully the final report will be a little less cynical, contain a little more substance and address the unique biological issues involved with GM foods.

      The report's reliance on the 'novel food' concept is troubling. It conveys a lack of understanding of the specific dangers presented by GM foods, dangers that transcend those of an otherwise 'novel' food. I do not worry that a novel source of fiber in the food supply will go on to destroy the ecosystem, create a new virus or pest, end up incorporated into the genome of my children, etc. The ultimate danger of GM foods is to collapse the ecosystems upon which life itself depends. To blithely insist that GM foods sit comfortably on a peer basis with other novel foods is to miss the point entirely. Whatever the likelihood of the risks involved with GM foods, those risks are potentially catastrophic. A novel packaging that subsequently turns out to poison, kill and maim might get a couple million of us at worst. GM foods could potentially kill us all.

      Re: "Scientists developing products of biotechnology do their work in labs, growth chambers and/or greenhouses. In these settings, the products are contained and should not come in contact with the environment. These activities are not currently regulated under the federal system. The Canadian Institutes of Health Research have guidelines for working with genetically modified organisms. Most research institutions — both public and private — also have their own codes of conduct and oversight committees for biotechnology research."

      Right. We need regulations that clearly enforce the requirement that GM material does not escape. Further, we need to address liabilities and put in place the financial infrastructure to deal with any breaches that impact human welfare.

      Re: "This means that plants produced through biotechnology are grown under conditions aimed at preventing the transfer of pollen to other plants; they are monitored by the experimenter and CFIA field inspection staff; and the trial site is subject to post-harvest, land-use restrictions and further monitoring."

      Have these methods not failed to achieve their purpose in the past? Why does your report not specifically address these instances and give an analysis of the dangers posed? Current techniques of sequestering GM plants and the legislation governing this are both wholly inadequate in my opinion.

      Re: "The question of commercial secrecy also arises in the debate over the government’s transparency. The desire of companies to maintain the confidentiality of data or information that they consider “commercially sensitive” has some impact on the degree of detail the regulator can provide in communicating information to the public. It also raises the question of who should determine what is commercially sensitive information".

      If commercial interests make a pact with the Devil to arrive at their FrankenFood, we need to know. Absolute disclosure should be mandatory with proper peer review and the opportunity for interested parties to comment intelligently. Informed consent is an absolute must. How can we give informed consent to something about which we know only partial details?

      The Network Endgame


      I believe that Facebook remains a significant threat to not only Google, but most of the other players in cyberspace as well. In fact, right now, Facebook is *the* significant threat. If I were in charge of Google I would be working furiously to contain this threat while there is still some chance. We may be past that point already. In the absence of any real competition and barring any serious misstep on their part, Facebook will swallow its competition on all sides.

      There will be one dominant player in cyberspace and it will be a social network. Right now, Facebook is the one to beat. It reached critical mass at around the 500 million MAU mark. This is an inevitable outcome of the mathematics of 'Group Forming Networks'. The compounded force of the interlocking networks makes Facebook extraordinarily sticky. In fact, mathematically, the acceleration toward the center of such a network exceeds that of gravity.
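The 'Group Forming Network' claim comes from Reed's Law, which values a network by its possible sub-groups (on the order of 2^n) rather than its possible pairwise links (Metcalfe's n^2) or its audience size (Sarnoff's n). A minimal sketch of the three growth laws (function names are mine, for illustration):

```python
# Hedged sketch of the three classic network-value laws.
# Sarnoff:  value ~ n               (broadcast audience)
# Metcalfe: value ~ n*(n-1)/2       (possible pairwise links)
# Reed:     value ~ 2^n - n - 1     (possible sub-groups of size >= 2)

def sarnoff(n: int) -> int:
    return n

def metcalfe(n: int) -> int:
    return n * (n - 1) // 2

def reed(n: int) -> int:
    return 2 ** n - n - 1

# Even at tiny n, group-forming value dwarfs pairwise value.
for n in (10, 20, 30):
    print(n, sarnoff(n), metcalfe(n), reed(n))
```

At n = 30 the Reed term is already in the billions, which is the sense in which a group-forming network compounds far faster than a broadcast or point-to-point one.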

      For Facebook, the battle has been all but won and it is now in the endgame.

      Amongst other things, Facebook has:

      - relationships with billions of people
      - data on most of the active users in cyberspace
      - a captive audience
      - much more of the world's end-user data (photos, etc) than anyone else
      - an extremely 'sticky' and interconnected network
      - active support for 'group forming networks' (GFNs)
      - the largest (by far) network of GFNs
      - committed users
      - very deep data on their users

      The above assets are near impossible for another company to duplicate, and as long as they are not disrupted somehow, Facebook will continue to accrete 'social mass' and increase its network 'density'. The ease of disruption falls off exponentially with the number (n) of people in the network. At over 2 billion people, that value (1/(2^n)) has become very, very small. It is so small that a network attempting to compete 'apples to apples' has virtually no chance of succeeding. To disrupt this network now will require seriously novel thinking.

      As a particular example, the way to monetize page-views (and hence search, blogging, tweets, online content, etc) is through advertising. Google must cast a wide advertising net to facilitate a sale. Advertising on Facebook can be targeted much more effectively. Ultimately, the source of funding for advertising comes from sales. As long as a dollar in advertising results in more than a dollar in net revenue for the advertiser, the dollars will flow. Google cannot guarantee that your advertising dollars will create net profits. Because Facebook knows much better who will buy, when and why, they can already offer a much greater bang for the buck. Within a few years (probably sooner), they will likely be able to demonstrate that their advertising services increase net revenue -- the net profit associated with the advertising will exceed the cost of advertising, making advertising on Facebook a necessity.
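The economics above reduce to a single inequality: an advertiser keeps spending as long as each dollar of ad spend returns more than a dollar of net revenue. A hedged sketch with hypothetical campaign numbers (the function and figures are my own illustration of the argument, not anyone's real data):

```python
# Hedged illustration: when does an ad campaign pay for itself?
# All figures are hypothetical.

def campaign_pays(spend: float, conversions: int, margin_per_sale: float) -> bool:
    """True when net revenue attributable to the ads exceeds their cost."""
    return conversions * margin_per_sale > spend

# Broadly-targeted campaign: wide net, low conversion rate.
print(campaign_pays(spend=10_000, conversions=180, margin_per_sale=50))

# Tightly-targeted campaign: same spend, far better conversion rate.
print(campaign_pays(spend=10_000, conversions=400, margin_per_sale=50))
```

Better targeting shifts the conversion count at constant spend, which is exactly why a network that knows who will buy, and when, can out-monetize one that must cast a wide net.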

      Google still has search, so they will still have eyeballs. However, as more of cyberspace falls into Facebook, Google can provide search for less and less of the internet. Search is only going to draw people who are looking for something. If they already know where it is (on Facebook), they will not visit Google. This is especially true as more relevant content moves into Facebook and effectively out of the reach of Google.

      Google has a lot of data on who searches for what, when and likely why. They also have a lot of data on volumes of search requests. No doubt they have some methods of determining how much of their audience is being captured and held by Facebook. The fact that Google, the company with the most intelligence on cyberspace prior to Facebook, is worried means they must have seen something alarming.

      Any company that is threatened by Facebook (and I think that is most of the players in cyberspace) needs to first of all understand *why* Facebook presents such an extraordinary threat. It is not clear to me that even the people at Facebook realize why they have grown so spectacularly.

      The best hope that competitors have is that Facebook stumbles badly and creates pressure for their user base to move. Failing a dramatic challenge from a huge player (Google, Amazon, Microsoft, Oracle, HP, IBM, etc.), government interference, or a spectacular stumble by Facebook, the game has already gone to Facebook.

      Excepting government interference, I think that Google, Microsoft and Apple have the best chance to disrupt Facebook. However, their window is rapidly closing. People and entities like companies, charities, clubs, etc. that already have data, a comfortable presence and network linkages on Facebook have no incentive to move away from Facebook. They need to be enticed away by something more compelling. As their presence on Facebook grows and is increasingly bound by linkages into the Facebook ecosystem, the ability to move them diminishes rapidly.

      Facebook vs Google


      Facebook presents unique challenges to Google. In my opinion the game has gone to Facebook. The only thing that can stop it is some combination of:

      1) Government interference
      2) Aggressive action by big players like Google, Apple, Microsoft, Amazon, etc.
      3) Big missteps by Facebook management
      4) Dramatic sabotage
      5) An aggressive, disruptive competitor

      Many appear to underestimate the threat presented by Facebook.

      People are motivated to 'act' when they use a search engine looking to buy something. They will be more receptive to specific relevant advertising. However, Facebook is increasingly in a better position to know who will buy and when. Search engines have a query string and perhaps a little history. As they refine the system, Facebook will have much more. They will only put ads where they will be most effective. An advertisement on Facebook will ultimately anticipate the search before it even happens. You will eventually not be able to present your ad because no search will take place. Both the search query and the data it is looking for will be on Facebook and you will be out of the loop.

      In the Facebook universe, I do not have to turn to a search engine to find out what folks around me are doing. It is scrolling by when I log in. As the Facebook relationship grows, they will anticipate more and more of what I might be interested in seeing and they will present it to me before I ask for it. Google is an excellent search engine, but it takes a dozen keystrokes or more to do any non-trivial search. It also requires me to shift attention away from what I am doing. On small-screen devices, it means I have to essentially switch applications to move off to do a search. As Facebook matures, it has the following advantages:

      1) It has the user's attention already. If you are not Facebook, you have to get the user's attention first.

      2) It has rich context beyond just a query string. It knows the user's habits and recent past, it knows this for friends, family, cohorts, demographic groups, etc. It gets real-time updates to this, so it 'knows what is going on'. [Network forces are cumulative. Here is one particular advantage this deep context confers on search: it can 'auto-up-vote and down-vote' pages. The user is more likely to see the results they want. If everyone in your social circle is looking for that dance video, Facebook can anticipate the search request and place the link on the page before it is even requested.] Given time, Facebook will know where you are, who you are with and how long you are staying. A search engine might use geocoding to guess where you are and use 'cheats' like persistent cookies to steal data otherwise.

      3) It can access the full universe of data inside and outside of Facebook. Search engines can only access data visible outside. If I am looking for that picture of my friend's cat it is not even in the search engine index.

      4) It already has relationships with buyers, sellers, intermediaries and other interested parties. Its relationships are persistent and interconnected providing much more intelligence.

      5) It knows the user and has enough information to determine 'worthiness' (credit, health, credentials, social status, etc)

      6) It can persist unlimited data specific to users, buyers, sellers and their relationships.

      7) Killer Facebook feature: mediated promiscuous data. There are ways to provide everything a vendor needs to offer a good quote without compromising user privacy in any way. Users can store private information without worrying that they will lose control of it.

      8) Function is embedded in a 'live' application. A search in Facebook is a request to a framework, not a search engine.

      9) The user has a 'vested relationship' with Facebook. Gaining access to private personal and financial information, contact information, etc is easier for Facebook because they already have a relationship of trust and they have ongoing storage related to the user.

      10) Facebook already has experience and expertise with infrastructure at scale. This can allow very rapid expansion of capabilities that can entirely take competitors by surprise. They can consolidate before competitors can respond.  
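Point 2's 'auto-up-vote' idea can be sketched in a few lines: re-rank otherwise-equal search results by how much engagement they have already drawn from the searcher's own social circle. This is a hypothetical scoring scheme of my own, not Facebook's actual ranking; the URLs and names are made up:

```python
# Hypothetical sketch of social-context re-ranking (point 2 above).
# Results the user's own circle has engaged with float to the top.

from collections import Counter

def rerank(results, circle_engagements):
    """Sort results by engagement counts within the user's social circle.

    results: list of page URLs in baseline relevance order.
    circle_engagements: iterable of URLs the user's friends clicked/shared.
    """
    votes = Counter(circle_engagements)
    # Python's sort is stable: ties keep their baseline relevance order.
    return sorted(results, key=lambda url: -votes[url])

baseline = ["news.example/a", "dance.example/video", "blog.example/b"]
friends_clicked = ["dance.example/video"] * 3 + ["blog.example/b"]

# The dance video everyone in the circle is sharing comes first.
print(rerank(baseline, friends_clicked))
```

A plain search engine has no `circle_engagements` signal at all, which is the structural advantage the list above describes.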

      There is no doubt in my mind that Facebook is investigating what they need to do to compete in all the major categories. If they are not, they soon will be.

      Right now, Facebook has a nearly insuperable advantage. They have the user's attention. Right now, it is possible to directly monetize that attention through advertising and they are doing that. However, the really big score is the deepening commitment of Facebook users. As these users continue to accumulate data on Facebook, their relationship with Facebook deepens in ways competitors simply cannot match.

      Network effects have taken hold at Facebook and since that is the current center of gravity the positive feedback loop is irresistible. In fact, I believe that the only thing preventing gravitational collapse right now is Facebook itself.

      I think a search company has a good chance of unseating Facebook. However, it needs to focus on the correct goal. Once Facebook institutes a usable search on their site it will be extremely difficult to compete with them in the search space.

      Here are a few 'quick hits' -- ideas that I would have in mind if I were doing a search engine competitor to Facebook:

      1) Providing a high-quality search is key. Bean counters will move in and suggest how you might 'improve' the business. Resist them. For now, the business is Search and you know more than they do.

      2) Users are your partners. Keep faith with them. Going forward, you should look for ways to share your success with your users. This may seem strange, but users create equity and ultimately they can snatch it away. Keep them from taking that equity away by sharing with them fairly. One idea that might have merit is to allow them to participate in some preferential way in the actual market equity of the company. Perhaps users could accumulate rights in something akin to warrants to buy shares in the company. When I say 'fairly', I mean that you should, for instance, protect their privacy.

      3) Lean really hard on getting into users' browsers as the default search engine. This is an instant win, and if you are at least as good as the other search engines, you almost win the battle (not the war) right there.

      4) Realize that social networking is the ultimate killer app. Search is your entrée, but providing users with a permanent home should be your goal. Users will go back to the location where they find their friends and family and their notes and pictures and messaging system, etc. The dominant social networking player will have all the time in the world to perfect search, applications like mail, online sales, news feeds, etc. If you compete in a 'niche' like search, you must be the best and stay that way. If you are the dominant social network, you just have to 'be'.

      5) Your true competition, Facebook, is still vulnerable. You can still beat them, but the window is closing: Facebook could render itself invulnerable in less than two years. While that window is still open, a killer search company is the best bet for replacing them. It is not an option to just stick to your search niche, though. Eventually, the dominant social network will take that over. You need to gain traction with your niche, but make no mistake, they will definitely close the search niche.

      It may be that the game is already over and Facebook has won. If that is the case, you should approach them as soon as you can to sell them the company or at least license the technology. Were you to join forces with Facebook today it would probably be a 'Google killer' and Facebook might well be willing to share significant equity to dispose of a potentially powerful rival like Google.

      There are many nuances in all of this. For instance, keeping faith with users may be one way to steer clear of government interference.

