Monthly Archives: December 2012

Polyethylene in Nivea Pure Impact!

December 29, 2012

http://i0.wp.com/khoobsurati.com/media/catalog/product/n/i/nivea_for_men_pure_impact_shower_gel.jpg?resize=175%2C175

So I accidentally bought a Nivea shower gel at the Appie with polyethylene in it… grrr… (see my post)

I was in a bit of a hurry and didn't have a list like this with me… so I didn't have the micro particles checklist on me either.

Could I still bring it back? (Why does the Appie even sell this junk? Or Wehkamp or Bol.com, for that matter…)

facebook: https://www.facebook.com/Stichting.De.Noordzee

(P.S. Unilever is already removing this stuff from its products worldwide; hopefully Beiersdorf will follow soon.)

Eternal life through Carnosine?

December 16, 2012

Question: can I live forever by taking Carnosine daily?

Background: we age because our cells can only divide between 40 and 60 times (the Hayflick limit); after that limit cellular senescence sets in and you die. That limit exists because with every division the telomeres of the DNA in the cells (a region at each end of a chromosome) become slightly shorter. At some point we can no longer replicate, and so we die.

If telomeres become too short, they have the potential to unfold from their presumed closed structure. The cell may detect this uncapping as DNA damage and then either stop growing, enter cellular old age (senescence), or begin programmed cell self-destruction (apoptosis) depending on the cell’s genetic background (p53 status).

So the problem is: those telomeres become too short, which is why we grow old and die.

Undoubtedly… we will come up with something, I estimate somewhere between now and 100 years from now, that solves this problem. In itself… it is not such a terribly difficult problem. It simply comes down to preserving the length of the telomeres of the DNA in our cells.

I want to survive until we have discovered something that preserves telomere length, and thus allows eternal cell regeneration, and thus means eternal life. It would be extremely annoying to belong to the last generation that still dies.

That is why… I read about Carnosine, which has been shown to extend the Hayflick limit.

Carnosine is found in red meat, but the problem there is that the concentration is too low to be retained in the body. So you need to take higher concentrations.

Nice, isn't it: eat my flesh, drink my blood, and you shall live forever?

Now… where do I order a container full of bottles of Carnosine?

For illustration: on the left, cells of a young person (A); below (B), cells of an old person; and on the right (C), cells of an old person to which Carnosine has been added…

image

Another illustration: two mice of the same age, on the left a mouse given Carnosine and on the right a mouse without Carnosine:

image

(images via http://www.autismcoach.com/l_carnosine_autism_p/ac-004.htm)

Update: a bottle of 48 teaspoons costs $38. Pff… On the other hand: if one bottle lasts me a week, it costs about as much as what I currently smoke away… I am going to read a bit more about it; I actually don't know how much I would need to take daily (or administer otherwise) to achieve the intended effect. Would Beta-Alanine work (5 g/day)? I will read some more about this.

Update 2: I found TA65 Nederland: http://www.ta-65.nl/index.htm. An interesting site that I haven't gone through yet; I don't know what its relation to Carnosine is.

Link

Some bookmarks/links I thought I'd drop here:

 

Complexity Level versus usage of Complexity Reduction or Management tools and processes

December 13, 2012

What I am thinking about is whether we can define, on the one hand, metrics to calculate complexity and, on the other hand, patterns that arise as complexity grows. We could use this to apply the right tools and processes at the right level more effectively.
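To make that a bit more concrete, here is a minimal sketch in Python of what such a complexity metric could look like. All names, weights and numbers are invented for illustration; the point is only that "complexity" gets collapsed into one comparable number:

    # Toy complexity metric (all names and weights are hypothetical).
    # Idea: count the things you manage, the people involved and the dependencies
    # between them, and collapse that into one number you can compare and act on.
    def complexity_score(items: int, people: int, dependencies: int) -> float:
        return items * 1.0 + people * 5.0 + dependencies * 2.0

    print(complexity_score(items=10, people=1, dependencies=0))          # 15.0   (the lone stamp collector)
    print(complexity_score(items=10_000, people=100, dependencies=500))  # 11500.0 (the stamp company)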

If you have no clue what I am talking about… the problem is that you probably still won't have a clue even after I explain it… but I will try (briefly):

If you have 10 stamps you couldn't care less about a system to manage stamps, and even less about a process for ordering stamps, but you are interested in their names, e.g. "10 cent christmas stamp". It's what children do: they slowly begin to associate names with objects, and by doing so they start managing the complexity around them.

If you get to 10,000 stamps you slowly begin to feel that you:

1. want to order them, e.g. by country, by value, or by "pretty and not pretty". This is more or less what science does ("this belongs to the family of apes and this to the family of trees"); these are called taxonomies.

2. would like a spreadsheet to make a list of them, so you can go to a stamp collectors' market and trade based on that list. In this list you would record properties of each stamp alongside its name, which is what a database does: it is a structured collection of data, with the emphasis on structured and what that implies. You don't need a process yet; you just print out your list and go to those collectors' markets.

If you get to 1,000,000 stamps in your collection you definitely feel the need for more taxonomies (e.g. "tags", "categories"), a more complex database, and many more ways to handle this complexity. You also feel the urge to slowly introduce some processes and stamp rules: stamps of high value go into the high-value album so you don't mistakenly trade them away, etc. You will also need albums that suggest default orderings of stamps, and catalogues that give average values, and so on. A finite list of possibilities to manage complexity. (I would like to make such a list when I have time; it would be needed for the solution to all of this.)
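In data-structure terms the same progression looks roughly like this; a sketch, with all field names hypothetical: from a bare list of names, to records with properties, to records that also carry tags and categories.

    from dataclasses import dataclass, field

    # 10 stamps: a bare list of names is all the "system" you need.
    small_collection = ["10 cent christmas stamp", "5 cent tulip stamp"]

    # 10,000+ stamps: structured records (the spreadsheet/database stage),
    # with taxonomies such as tags and categories layered on top.
    @dataclass
    class Stamp:
        name: str
        country: str
        value_eur: float
        categories: list = field(default_factory=list)  # e.g. ["christmas", "high value"]

    large_collection = [
        Stamp("10 cent christmas stamp", "NL", 0.10, ["christmas"]),
        Stamp("Blue Mauritius", "MU", 1_000_000.0, ["high value"]),
    ]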

Now… suppose you have developed a system or a new process (catalogue, description, templates for entering stamp information or stamp-selling forms, people) to manage large collections of stamps. Would you try to sell these to people with only 10 stamps? Probably not.

Now… suppose you own a stamp company that has 99,000 departments in 200 countries. Managing this is uber-complex. You feel the need for standard tools, databases, processes, and so on. But a specific department in one specific country might consist of just one person managing 10 stamps. This person couldn't care less about a system for managing stamps, and he or she is probably right, because it would take more time to work with that system or process than to simply handle the 10 stamps.

Now… IF that stamp company were to prescribe one system to manage stamps, the smaller departments would not benefit from it, since their complexity level is not high enough. But… IF a stamp collection system existed that would "scale" tools and processes depending on the complexity of each department… that would work.

So that is what I am thinking about: a scalable system that, depending on complexity, introduces or scales complexity-reduction tools and processes automatically. That "automatically" means no human intervention would be needed, since the system would know that with a metric of 5 stamps no database is needed, while with 10,000 stamps it would automatically introduce new "help", because that is where the break-even point lies between "going crazy from too many stamps" and "learning a new tool and registering them in it".
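A minimal sketch of that "automatic" part, assuming a complexity score like the one above and purely hypothetical thresholds for the break-even points:

    # Hypothetical thresholds: below each score, the overhead of the heavier tool
    # would cost more than the complexity it removes (the break-even point).
    TOOL_LEVELS = [
        (0,       "names only: just remember what you have"),
        (100,     "taxonomy plus a spreadsheet"),
        (10_000,  "database with tags/categories and some simple processes"),
        (100_000, "configuration/change management tooling and dedicated roles"),
    ]

    def recommend_tools(score: float) -> str:
        # Return the heaviest tooling level whose threshold the score reaches.
        recommendation = TOOL_LEVELS[0][1]
        for threshold, tooling in TOOL_LEVELS:
            if score >= threshold:
                recommendation = tooling
        return recommendation

    print(recommend_tools(15.0))      # names only: just remember what you have
    print(recommend_tools(11_500.0))  # database with tags/categories and some simple processes

The interesting (and hard) part is of course not this lookup table, but finding metrics and thresholds that actually hold across domains.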

Another example is development:

If you are a single person out there creating your own website with WordPress… you couldn't care less about version management. You don't even have a clue what it is.

If you have 25 websites that you work on in parallel, you read about version management and, for example, install Subversion, Git, or Mercurial, so that you at least have components of frequently used chunks of code that you can name, e.g. "jquery gallery version 0.1" and "jquery gallery version 0.2", and that you use in different sites at the same time (version management tools give you the benefit that you can name each iteration of your code). At that stage you couldn't care less about more complex management using, for example, configuration management tools and processes. You will probably work with "todo lists" in a shared spreadsheet or something similar, just because the complexity level does not yet drive you crazy enough: "you can handle it".

If you have 100 large websites and you work on them with 100 people in bursts of improvement… you slowly go crazy, because people keep doing the same work on the same code. You feel the need to componentize things further, and you might even feel the need to somehow handle the environment you develop on and the environment you deploy to. So you slowly move to configuration management and configuration management tooling, e.g. ClearCase: being able to "play" with bigger chunks of versions that you combine, e.g. "on production server 23" and "under development on development server 2". You start using some kind of system that keeps track of "what everyone is doing": a change management system. It's simply because the complexity is there… so you are forced to use "something" that helps you reduce, or at least manage, that complexity.

At a certain stage, e.g. working with 500 people on many different parts of the code, you would even hire "people who manage that complexity", e.g. configuration managers, change managers, or requirement managers, whose job is basically to think of ways to reduce complexity or at least make it manageable. A good configuration manager has experience in how to reduce complexity. A really hard job (and of course hard to understand why it would be needed if you work in a less complex environment, or if you are just one node in that group of 500 people rather than the manager of those 500 people).
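The same pattern for the development example, again as a hedged sketch with invented numbers, mapping team and codebase size onto the tooling described above:

    # Invented mapping from team/codebase size to the tooling the text describes.
    def dev_tooling(websites: int, developers: int) -> str:
        if developers <= 1 and websites <= 5:
            return "no version management needed yet"
        if developers <= 5 and websites <= 50:
            return "version management (git/subversion/mercurial) plus a shared todo list"
        if developers <= 200:
            return "configuration management plus a change management system"
        return "dedicated configuration/change/requirement managers"

    print(dev_tooling(websites=1, developers=1))      # no version management needed yet
    print(dev_tooling(websites=25, developers=1))     # version management ... plus a shared todo list
    print(dev_tooling(websites=100, developers=100))  # configuration management plus a change management system
    print(dev_tooling(websites=500, developers=500))  # dedicated configuration/change/requirement managers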

So… what I am thinking about is… if some kind of system could be made that calculates complexity and, based on that, automatically "surrounds" people with the appropriate level of tools to manage that complexity, that would be… nice.

The examples above are just specific examples of tools or processes around stamps or "IT". What I'm thinking of is a bigger approach. Start here: http://en.wikipedia.org/wiki/Ontology, and read http://en.wikipedia.org/wiki/Upper_ontology, maybe the chapters "Arguments for the infeasibility of an upper ontology" and "Arguments for the feasibility of an upper ontology". Because in the end, something like this would need to be created at a foundational layer, as one of the chunks "in there".

In the IT world some ready-made ontologies are available, so that the people responsible for thinking up ways to manage complexity do not have to start from scratch. You can think of all kinds of frameworks, ranging from ITIL to RUP, that encompass assets, processes, and handy ready-made templates, "measurement of maturity" approaches such as CMMI, a gazillion tools that do or do not integrate with these processes, and so on. And then, on a meta level, methods and tools to combine these frameworks and produce new ones. Actually… too much to describe in a quick blog post.

So… I think… there is a little puzzle piece missing here. It’s the “component” that scales the meta solution more or less automagically based on complexity level.

And I do realize that metrics for complexity are dependent on many factors including the specific domain.

No one will ever read this, so I'm safe :) So… let me copy and paste something here to combine later with upper ontologies, meta-modeling, and "the scale of when what is needed, for the people in a specific case, based on the complexity surrounding them":

Process philosophy (or ontology of becoming) identifies metaphysical reality with change and development. Since the time of Plato and Aristotle, philosophers have posited true reality as “timeless”, based on permanent substances, whilst processes are denied or subordinated to timeless substances. If Socrates changes, becoming sick, Socrates is still the same (the substance of Socrates being the same), and change (his sickness) only glides over his substance: change is accidental, whereas the substance is essential. Therefore, classic ontology denies any full reality to change, which is conceived as only accidental and not essential. This classical ontology is what made knowledge and a theory of knowledge possible, as it was thought that a science of something in becoming was an impossible feat to achieve.[1]

In opposition to the classical model of change as purely accidental and illusory (as by Aristotle), process philosophy regards change as the cornerstone of reality–the cornerstone of the Being thought as Becoming. Modern philosophers who appeal to process rather than substance include Heidegger, Charles Peirce, Alfred North Whitehead, Robert M. Pirsig, Charles Hartshorne, Arran Gare and Nicholas Rescher. In physics Ilya Prigogine[2] distinguishes between the “physics of being” and the “physics of becoming”. Process philosophy covers not just scientific intuitions and experiences, but can be used as a conceptual bridge to facilitate discussions among religion, philosophy, and science

Will I go crazy thinking about this?

A single developer who starts using a version management system is doing this to manage complexity. His concern is "his code", and he started doing this because he started thinking about the complexity of his parallel codebases. A project manager who starts chunking up work into a breakdown structure is doing exactly the same. We can scale this up larger and larger, and in the end we arrive at branches of philosophy, whose concern is to manage the complexity of "everything" via a structured approach. At that highest level the reason to do this is to "explain the world", and practically it gave us all the sciences and probably everything around you. So if all this can be made scalable, it has to work at the top level, at the lowest level, and at all the levels in between, and depending on the level, the complexity management and reduction meta-level would scale along.

Basically every human is doing this stuff. In your household you manage your "budget" by making a taxonomy of companies and organizations you have a financial relationship with, and you quantify this by adding "fields" such as the amount of money transferred between you and each company. (You manage "likes" on Facebook by giving this same taxonomy of companies and organizations a "like" or "no like".) If you run a smaller project, you will use simpler tools and processes to manage the complexity. If you are responsible for multiple smaller projects that use these simpler tools and processes, you will use tools and processes available to manage even greater complexity, etc. But the "meta-model" behind ALL of these tools and processes, scaling up, would be needed to make such a component.

Maybe a key lies in the philosophy of science, but I have only started reading books about it.

Linkedin is missing "Family"

December 5, 2012

For the future company (whether Apple, Google, Microsoft, IBM, or Oracle) that will buy Linkedin and MyHeritage-Geni and Facebook and Schoolbank and thousands of other social networks…

… it would be handy if Linkedin included "Family" in the "How do you know …" box. It would save some trouble in the future data-conversion project from Linkedin to the generic all-people-of-the-world-who-ever-lived-and-their-relations data warehouse. Otherwise we will have to "match" based only on single contact properties, and that will not be handy.
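To illustrate why the missing relation type matters for such a conversion, here is a small hedged sketch (hypothetical records and field names) of what the matching falls back to without it:

    # Hypothetical records from two networks. Without an explicit relation type
    # such as "Family" on the Linkedin side, a merge can only guess from single
    # contact properties like the name.
    linkedin_contact = {"name": "Jan Jansen", "how_we_know": "Colleague"}  # no "Family" option
    geni_contact     = {"name": "Jan Jansen", "relation": "brother"}

    def probably_same_person(a, b) -> bool:
        # With "Family" available on both sides this match could be confirmed;
        # here the name is the only shared property to go on.
        return a["name"] == b["name"]

    print(probably_same_person(linkedin_contact, geni_contact))  # True, but only a name match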

 

image

BMW 320d EDE Touring at 20% bijtelling is out

December 4, 2012

image

I currently drive the "old" BMW Touring (born in 2012, just before the new models). I had it in my mind that if BMW released an EfficientDynamics edition of the Touring at 20% bijtelling (the Dutch company-car tax rate), that would be very interesting. However, the BMW dealer told me last year that it would definitely not appear. A pity.

But… I just received this email after all:

Dear Mr. De Leau,
BMW introduces the BMW 320d EfficientDynamics Edition Touring. With an average consumption of 1 litre per 23 kilometres and 112 grams of CO2 per kilometre, this 3 Series Touring qualifies for the 20% bijtelling category. The largest luggage compartment in its class and an electrically operated tailgate with a separately opening rear window are standard. The 320d EDE Touring can be ordered from early January and will be available from March 2013.
As with the 320d EDE Sedan, BMW's innovative EfficientDynamics solutions are the key to the 20% bijtelling. As a result, from March onward both the Sedan and the Touring models with automatic transmission can also be fitted with a tow bar. A refined combination of BMW TwinPower Turbo and direct Common Rail injection makes emission values possible that are uniquely low for this segment, without giving up any driving pleasure.

With a consumption of 4.1 l/100 km and 112 g CO2/km, this model has no competition.

Read more: http://www.bmw.nl/nl/nl/general/webspecials/20-procent-bijtelling-bij-bmw/320d-touring-efficient-dynamics-edition.html

(The price is not yet known: http://www.bmw.nl/nl/nl/general/downloads/_narrowband/pdf/BMW_3_Serie_Sedan_Touring_prijslijst_10_2012.pdf )

Windows 8 upgrade for little money

December 4, 2012

image

I hadn't gotten around to it yet, but yesterday it finally came off my todo list… I downloaded the Windows 8 upgrade assistant yesterday, and… 30 euros for my wife's Vista laptop and about the same for my own Toshiba Windows 7 laptop. Cheap! For that amount… done right away.

On my wife's Vista laptop everything went so smoothly that I almost thought back with nostalgia to the days when I still upgraded PCs with a CD in hand, via all kinds of detours. I had all sorts of horror scenarios in my head.

Now it's: start the Windows 8 upgrade assistant, next, next, pay 30 euros via PayPal or credit card, "you can keep working while Windows 8 installs", pom ti dom, and… we're running Windows 8 with the new apps screen at startup.

A wow experience; try it if you haven't switched to Windows 8 yet.

See: http://en.wikipedia.org/wiki/Features_new_to_Windows_8

See: http://en.wikipedia.org/wiki/Windows_8

That coincides nicely with my Xbox Music subscription :)

Other personal Microsoft news: I just saw that the RTL XL (catch-up TV) app has also appeared on my Xbox 360: also handy.

Hmmm… that Lumia 920 (two of them, then) is now becoming imaginable…
