Wednesday, December 21, 2016

Elementary introduction to our brain



The brain: basic morphology and the biological design of cognitive logic functions


Since the brain is the cognitive logic engine that drives Homo sapiens and is responsible for all of our evolutionary and technological development, it is necessary, in order to make better use of the resources at our disposal, to become acquainted with the basics of its morphology down to the functional level, and to recognize the functionality that derives from it. This is essential both for the development of management techniques and for the development of artificial intelligence, which attempts to replicate the way our brain works.
Evolutionarily speaking, the brain has only two predefined functions:
·         Survival and the functions associated with it (we spend about 90% of our time in this mode). This mode is also called local and linear logic, and it originated in the savannas of Africa, as described by Peter Diamandis.
·         Learning and the functions associated with it (we spend only about 10% of our time in this mental mode).
There are several theories about how Homo sapiens reached the new evolutionary level that defines modern humans. They can be reduced to the following hypotheses, based on biological changes and/or the conditions of life in the African savannas where modern man evolved as a species:
·         Loss of limb functionality: apes, our closest animal relatives, effectively have four hands, whereas humans have only two; the upright posture placed a much greater load on the hind limbs and drove the development of the foot.
·         The upright-posture theory, usually linked to fruit picking, which led to changes in sexuality and therefore in the social behavior of the group.
·         Running, as a way of escaping savanna predators where there were no trees to climb, which required a solution to the problem of overheating, so our ancestors lost their fur and most of their body hair.
·         Increased sexuality: for the first time in primate evolution the consent of the female is required (the phenomenon of psychosexual manipulation), for the first time sexuality rests on an emotional rather than an olfactory basis, and males no longer wait for the female's fertile period to attempt to reproduce.
·         The development of the hand reached a qualitatively and quantitatively new evolutionary level, turning it into a part of the body able to create tools, and thereby civilization.
·         Changes in eating habits: the human stomach could not digest the larger amounts of animal food, so food began to be processed before it reached the digestive tract. The emergence of cooked food meant that our ancestors took in far more energy, more energy could be supplied to the brain, and this changed the way the brain works and enabled its further development.
·         The atrophy of the olfactory part of the brain, which separates us as a species from other animals, and its replacement by the emotional parts of the brain: the emergence of emotional beings.
·         Homo sapiens acquired molars, which extended the range of foods that could be eaten.
·         The long time needed for offspring to reach independence and full maturity encouraged the social organization of groups, tribes and clans, and therefore the division of labor and responsibilities.
·         The use of tools and the creation of new ones (regardless of whether they were originally developed by Homo sapiens or by Neanderthals, from whom our kind took them).
·         Interbreeding with Neanderthals, which contributed to the transmission of disease to that group and to their disappearance from the evolutionary scene, i.e. to all natural resources being left to the single remaining evolutionary winner: humans.
·         The female brain of Homo sapiens (perhaps this was also true of Homo erectus and Neanderthals) is wired to forget male success; since in our species mating is not strictly tied to the fertile period as it is in other animals (chimpanzees, gorillas and orangutans aside) and an emotional element had already entered, the male must prove himself again and again, which is one of the drivers of the progress of the species.
All these circumstances led to further changes: the brain grew, filling the frontal cavity, and its frontal part was then forced to deform and fold so that a greater number of nerve cells, i.e. a greater mass of brain tissue, could fit into the same space. This is precisely why we differ so much, not only from our ancestors (in terms of mental capacity and abilities) but also from our closest relatives: the developed frontal cortex in humans has roughly four times the surface area of that of an ape, and it is precisely this part of the brain that evolved last and in which all our cognitive abilities reside. Here also lies the answer to the question of whether further evolutionary development of the brain, our most important organ, is possible. It is unlikely over a short evolutionary period, because there is not enough free space for further growth in mass; the more likely scenario is gradual evolutionary development over a much longer period, since evolutionary processes are very slow when measured against the lifetime of a human being. The brain, like the body, has evolved throughout the existence of Homo sapiens. Today's brain is roughly twice the mass and volume of the brain our ancestors possessed. It uses about 20% of the body's total energy, and the power dissipated by its work is roughly that of a small light bulb.
Neuroscience divides the brain into hemispheres with the following characteristics:
·         The left hemisphere is responsible for speech and logic.
·         The right hemisphere is responsible for locating objects, for temporal and situational orientation, and for personal decisions. It should also be noted that the right hemisphere gathers information but cannot verbalize it (it cannot speak); it can, however, turn this accumulated knowledge into something else. These processes of knowledge acquisition in the right hemisphere are particularly important before a child speaks for the first time, because the knowledge of the world gained at this non-verbal age is preserved throughout life and forms the basis for intellectual activity in adulthood.
·         Each hemisphere is responsible for the motor control of the contralateral limbs (the right hemisphere controls the left arm and leg).
All other unconscious, especially vegetative, processes that take place constantly are regulated by the hypothalamus. It is the only fully developed part of the brain at birth, and it contains the reflex emotions that babies have. Everything else babies learn through audio-visual exploration of the world in the months after birth. The amygdala, which manages the emotional side of the person, develops at around 11 months; at around 12 months the child begins to distinguish its mother from other women and creates a special bond with her, and somewhere in that period it recognizes itself in the mirror, which is taken to mean that self-awareness has developed by then.
One of the misconceptions, that the brain stops working or works at reduced capacity when we rest, has been scientifically disproved: we now know that the brain always works at 100% of its capacity. Around 70% of the energy the brain uses goes to processes we do not understand; in the literature this is now often called the dark energy of the brain, which suggests we know about as little about it as we do about the dark energy of the universe.

 

Memory and the consequences of memory


One of the primary mental abilities that developed during evolution and drove the growth of the brain, as the organ responsible for managing the rest of the body but also for interacting with the environment, is memory. Memory developed slowly, evolutionarily speaking, but it was one of the main survival strategies: animals that could remember more could create and use better mental maps, and were therefore more capable in nature. It is known, for example, that the catfish can remember only the last two seconds, which is why the same animal can be hooked repeatedly within just a few minutes; it is simply unable to remember the danger it has just perceived. This ability was of crucial importance to all primates, and all organisms develop it through evolution. On the other hand, uncontrolled memorization of large amounts of information would quickly overload the nervous system and lead to the collapse of the organism, so our ancestors also had to develop the mental capacity to delete unnecessary information and retain only what was most essential to them in terms of evolutionary advantage. Nature's survival strategy was to increase the number of synapses and neural connections, and with them the processing capacity, which explains the evolution of the brain and its morphological differences compared with our ancestors and animal relatives; but since this is a gradual process, a more efficient method had to be found to preserve the species, and deletion probably originated as a natural response to the need to reconcile the processing capacity of the nervous system with the capacity of our senses and of the networks that transmit nerve signals.
In modern Homo sapiens, memories are stored in different parts of the brain, and the neocortex possesses a control mechanism for accessing, managing and deleting them. The problem with the neocortex, when it comes to memory, is that this part of the brain tries to keep the best moments in life forever: it either replays them as memories, or tends to create impulses that lead us to repeat them, or creates a need (or predisposition) to use chemical substances to restore the given state. On the other hand, on the emotional side of the personality there is another, almost vegetative, mechanism for remembering emotions, based on the work of the amygdala. The amygdala is an emotional well that cannot be emptied, and in some people, in very old age, its overload leads to grumpiness and anxiety. A particular problem is the memory of emotional states (not just events), because an emotion must be aversive in order to be remembered; if it were not aversive it would not be remembered and could not drive movement toward something better, which is the main driving force of the development of human civilization.

Physiology and morphology of the brain


To feed itself, the brain consumes nearly a quarter of the body's oxygen and up to two thirds of its glucose, for which evolution created a network of blood vessels in the brain as much as 2,000 kilometers long. At the same time, processes such as hypoxia and ischemia (reduced blood flow in the brain) can seriously and quickly compromise the functioning of the brain and lead to death: 4 to 5 minutes after being left without a sufficient supply of blood, nerve cells begin to deteriorate, and after about 10 minutes death follows. 10-15% of deaths worldwide each year are due to an inadequate supply of blood to the brain, caused by the atrophy and deterioration of blood vessels with age, i.e. by burst capillaries and brain haemorrhage.
It is important to note that during the development of the brain and nerve pathways, neurons adhere to a clear developmental scheme: they connect vertically first and only then horizontally. Nerve cells migrate, guided by chemical signals encoded in our DNA, until they arrive at the precise location in the body where they settle permanently. The nervous system formed in this way extends from the brain and spinal cord to all nerve endings, i.e. sensory cells. During evolution the synapse model developed to ensure connectivity between nerve cells.
Data about the outside world are transmitted from the senses to the brain by electrochemical reactions, i.e. by bioelectrical currents and the chemical reactions that occur at synapses, where impulses are passed on under biochemical control of the information packets being transferred (only molecules whose chemistry matches the key can continue through the membranes of nerve cells). To ensure that signals reach the junctions between nerve cells, and to prevent the uncontrolled electrochemical reactions in the brain that occur during an epileptic seizure, nerve cells are wrapped in Schwann cells, which serve as insulators and prevent short circuits between nerve cells; these make up the white matter of the brain.
It is important to note that thinking is by nature a chemical process, and that during thinking the brain uses different regions, so blood must be diverted to those regions; that is, blood flow is redirected to the regions responsible for forming intentions and making decisions. Repeating certain trains of thought leads to frequent redirection of blood to the corresponding regions, to better circulation in those regions, and to the creation of a "hardware base" that makes those parts of the brain easier to use. This can explain the learning process, whether motor or conscious learning. The same process occurs with habits. At the morphological level, habits are nothing more than established paths (routes of chemical and bioelectrical reactions) in our nervous system. Repeating certain decisions strengthens this hardware path (builds a reliable neurochemical infrastructure). Habits create a permanent path in our brain, and we can be rid of them only by creating an alternative path (this is why neuropsychology holds that for any habit we want to change we need to create a suitable replacement).
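As a loose software analogy only (not a model of real neural chemistry; all names and numbers below are invented for illustration), the idea that repetition strengthens a path, and that a habit is displaced only by reinforcing an alternative, can be sketched like this:

```python
# Toy illustration: every use of a "path" strengthens it a little, and the most
# strongly reinforced path is the one chosen by default.
def reinforce(weights, path, amount=0.1):
    """Strengthen the chosen path slightly each time it is used."""
    weights[path] = weights.get(path, 0.0) + amount

weights = {}
for _ in range(30):            # an old habit, repeated often
    reinforce(weights, "old_habit")
for _ in range(45):            # the replacement has to be repeated even more
    reinforce(weights, "new_habit")

# Whichever path is most strongly "wired in" wins when a choice is made.
print(max(weights, key=weights.get))   # -> new_habit
```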

The basics of hypnosis and the theory of mind


The basics of hypnosis rest on a theory of mind that psychologists explain as follows. Everything that happens to us (and around us) we perceive through basic processes taking place in the waking mind. These basic processes are identification and association, which bind new input to previous experience, i.e. to the mind's folders, and from which our reaction to events in the environment arises. A person reacts when they connect a phenomenon they have identified with one in their mental map. At lower levels of evolutionary development human beings developed a primitive mind that controls basic reactions to stimuli from the external environment and makes the "fight or flight" decision (the basic decisions tied to biological survival strategies). Since children are born without a fully formed brain, the development of critical thinking is not complete until around the age of 9. The mind formed in this way consists of about 88% unconscious and only about 12% conscious mental processes, the latter comprising mechanisms such as logic, common sense, and rational thinking and decision-making. The mind is most often described as software installed on the hardware called the brain, so in addition to the psychological connections created in the mind, it is also affected by the biochemical processes taking place in the brain. As explained earlier, for the mind to work interactively the amount of information the brain processes must be reduced to an acceptable level, i.e. to a level the nerve cells of the brain can process in real time. For this, filters must exist, and the messages sent to the brain for processing must be structured according to the information to be processed. We can therefore say that the message packets depend on the environment, the body, and conscious and subconscious thought processes, and that only packets classified according to these criteria can be processed by the brain. As the amount of information we want to transmit and process grows, we reach a tipping point at which the brain can no longer receive and process new information; the mental filters that limit the amount of information reaching the critical part of the mind fail (become inactive), new information reaches the brain for processing without passing through the filters, and the person becomes susceptible to suggestion, which is the basis for the application of hypnosis. Hypnosis is basically nothing more than unimpeded (direct) access to certain parts of the brain, i.e. of the mind.
Each person can recognize themselves in a mirror, but alongside this image they also carry an internal image of themselves that is distorted, so that each person has their own sensory and motor map of the body, which is of vital importance for orientation and functioning, especially for the motor functions of the organism and the ability to orient oneself in space.
The human experience of reality can briefly be described as follows: when an external event occurs, we first create an internal (private or personal) representation of the event (abbreviated: I/P). The I/P is then combined with our physiology to finally create something we call a state. By "state" we mean the emotional state of the individual. When you consider that a person's life is nothing other than a long series of different states they enter, the value of understanding the components that shape those states becomes more than obvious. The state you are currently in is a combination of your internal pictures, sounds, feelings and inner dialogue, and every other state you can find yourself in is also the result of these factors.
Deletion is the ability of the mind to create a mental picture by focusing on a relatively small number of key details while excluding all non-essential elements of reality. It is estimated that in one second our nerves transmit around 4,000,000,000 impulses to the brain, while our consciousness processes only around 2,000 of them, i.e. roughly one in every two million. If there were no deletion, consciousness would have to deal with an information overload for which it has no capacity (the most accurate estimates so far suggest that conscious processing handles only around 7 bits per second), with a delay that would grow over time, so that our consciousness would no longer be dealing with reality but with the past; by the Darwinian laws of nature such an organism could not survive, and would have to disappear and be replaced by a more successful species better adapted to reality. The neuroplasticity of the brain is enabled by:
·         Chemical changes
·         Structural changes
·         Functional changes
It should be noted that the latest research shows that short-term memory rests on transient chemical changes in the brain, while long-term memory rests on structural changes, manifested in the establishment of new connections between regions and parts of the brain. What are the limits of neuroplasticity? For now there is no answer to this question. What we do know today is that the brain is capable of neuroplasticity, i.e. of transforming its physiology and structure throughout a person's life. Neuroscientists have found that the brain is shaped by everything you do and everything you do not do, and that the best route to neuroplasticity is behavior (behavioral changes cause changes in the brain). Exercise is also significant for neuroplasticity (it is the second most influential factor).

Friday, December 16, 2016

Testing the Internet as a comprehensive global ICT system



Testing ICT systems is a phase of every project, but also a measure of quality assurance for ICT products and services. Unlike testing in other industries, system tests are performed at three different levels: meeting the technical specifications for hardware, meeting the functional and technical requirements for software, and the behavior of the system in different environments and boundary conditions. There is also a special type of testing that examines and proves reality itself.
Testing most often relies on best industry practice, but it is often necessary to extend testing beyond the areas the industry knows, or that the system designer had in mind when the system was conceived for given conditions and purposes. This is the case whenever the system operates outside those boundaries, i.e. whenever the manufactured and installed system exceeds the expectations of the designer or the contracting authority, so that users discover many unforeseen ways to use it.
Standard system testing covers borderline cases, performance, conditions the design did not predict, and unforeseen inputs and methods of use. For devices it is usually carried out directly on the device itself, while large, spatially distributed systems are commonly tested on a model of the system, in which case it is very important that the model represents the real system as closely as possible.
Tests are designed so that together they cover the system's entire specification as well as all possible outcomes, while keeping the number of distinct tests at a reasonable level: a new test method is introduced only if, given the existing features, it can produce a new outcome; there is no need to prove what has already been proven.
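As an illustrative sketch only (the function, limit and values below are hypothetical, not taken from any real system or standard), boundary testing of this kind is often reduced to one case per distinct outcome at and just beyond the documented limits:

```python
import pytest

MAX_PAYLOAD = 1500  # hypothetical documented limit, in bytes

def accepts_payload(size: int) -> bool:
    """Stand-in for the system under test: accept anything within the limit."""
    return 0 < size <= MAX_PAYLOAD

# One case per distinct outcome: smallest valid, at the limit, just beyond it, degenerate.
@pytest.mark.parametrize("size,expected", [
    (1, True),
    (MAX_PAYLOAD, True),
    (MAX_PAYLOAD + 1, False),
    (0, False),
])
def test_payload_boundaries(size, expected):
    assert accepts_payload(size) == expected
```

Adding a further test that could only repeat one of these outcomes would, in the spirit of the paragraph above, prove nothing new.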
It is worth mentioning that the tester examines and proves reality, and that tests conducted on a mathematical model can never be more accurate than the model itself, so every inconsistency or error in the model automatically leads to errors in testing.
Perhaps the best recommendation for testers is to be professional skeptics who do not trust any of the pictures, schemes or information obtained from the system's developers, operators or owners. One reason to be skeptical of everything is that ideas being implemented often have to be simplified to fit into particular technical standards, and so that they can be presented more easily to people who only need to understand them at the level of ideas and do not possess deeper knowledge of the field. Testers therefore have to measure and verify reality, trust only the data elicited from the real system, and keep to a firmly critical line when evaluating results.
Simplification, as we have already noted, is a source of problems, because data obtained from the real system can differ significantly from data obtained from the model.
A special characteristic of system testing is the fact that no one, neither the designers nor the testers, knows everything about what the system does and how it works. This is because the system has wider limits than those we observe. This raises the problem of deciding which parts of the system must be observed and measured and which need not be, i.e. which characteristics and which interdependencies between the relevant parts of the system matter.
When discrepancies and anomalies appear, the question is whether these problems matter for the system as a whole or are only local, whether they affect other elements of the system, and whether and how they propagate and escalate. Since most professionals in the ICT sector have no education in systems theory, most local problems are treated as isolated and local, and the fact that every element of a system affects every other element and the system as such is often forgotten or consciously disregarded; the only remaining question is whether the accumulated small problems escalate enough to threaten the functioning of the whole system.
In this respect it should be noted that, in theory, every system has a context. This means that testers must work out that context before testing the system. This is particularly important when testing software for extra functionality, i.e. in the gray area where an application, device or piece of software works outside its designed values. Unwritten conditions require checks of conditions that were not originally planned in the project. In this way, testing of real solutions expands the range in which a device or piece of software can be used reliably. This type of test has so far been reserved mainly for the software components of systems and rarely for infrastructure, but with the rapid growth of infrastructure and its continuous overload caused by the accelerated growth of the services that use it, this kind of testing has become very interesting for infrastructure as well.
In this sense it is interesting to test growth scenarios, which can loosely be described as an attempt to understand how a design will fall apart if any element of the system is pushed too far; in this particular case, when the network infrastructure collapses because too many simultaneous services appear on the network and try to use it at the same time (the problem of insufficient capacity). In this way, testing the system becomes an engine for constant, deep learning about the system.
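A minimal sketch of such a growth scenario, with all numbers invented purely for illustration (they do not describe any real network), might simply grow the number of simultaneous services until the offered load exceeds a fixed capacity:

```python
import random

CAPACITY = 10_000        # hypothetical units of work the network can serve per tick
LOAD_PER_SERVICE = 12    # hypothetical average load generated by one service

services = 100
for step in range(1, 100):
    services = int(services * 1.15)   # assume 15% growth per step
    offered = sum(random.gauss(LOAD_PER_SERVICE, 3) for _ in range(services))
    if offered > CAPACITY:
        print(f"step {step}: {services} services offer {offered:.0f} units of load, "
              f"exceeding capacity {CAPACITY} -- this is where the design breaks")
        break
```

The point of the exercise is not the numbers but the question it forces: which element gives way first, and what happens to the rest of the system when it does.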
System testing must not be so expensive that a company or institution refuses to do it, but also not so cheap that, because of staff turnover and the loss of knowledge of testing techniques, it becomes haphazard. It is therefore important to note that many so-called automated tests are technically not tests at all but merely verification of characteristics.
Automated tests cannot be aware of emergent problems, especially problems that occur in the time between two automated runs. It is therefore important to always perform system testing with the application of human intelligence as well, because at the moment no automated software system or artificial intelligence service is complex, precise and accurate enough to meet the requirements of automated testing while maintaining critical awareness.
Also, supervising the work of testers is particularly difficult in scripted tests. However, tools such as the Panaya Scenario Recorder make it possible to create test scenarios that can be reused automatically, replacing the work of human testers in certain areas and freeing people for the necessary training in the field of testing. Testers need to stay sharp in order to do their job properly.
If we look at the work of testers in the ICT industry, we notice that in this industry almost everything is tested: every product or service, software, hardware, every system element; yet no extensive test of the full global infrastructure has ever been carried out. Although there is a very large number of tests that can consistently be applied to elements of the Internet infrastructure, nobody has ever conducted a real full-scale crash test of the Internet. Given its size and complexity, as well as the countless technical redundancies in its software and hardware infrastructure, it is questionable whether this is even feasible. But since not only our economy, government and legal systems depend heavily on the Internet, but most of the lower layers of social infrastructure (companies, families, associations, cities, personal connections) are also directly and inextricably linked to it and depend on it to a large degree, the question arises whether such a test is necessary and what we would be able to learn from it.
If we agree that it is necessary, at least for determining the limits of the system's resilience, then we must work out how to conduct such a test in a way that minimizes the consequences of carrying it out. Knowing the system's limits of resilience would certainly help to establish more clearly the need for institutional redundancy and to develop disaster-recovery procedures. The development of these procedures is necessary, one could say critical, both for minimizing the effects of disasters and for the operational management of the global network should the Internet ever fall in its entirety.
Since it is difficult to imagine a natural disaster that would completely destroy the Internet's infrastructure while still leaving the human race able to survive, we come to the conclusion that the only likely scenario of this kind is the action of malicious individuals or groups united around such a goal.
Of course, such an ambitious goal is hardly feasible without the enormous technical and human resources required to carry out such a project. If a group with this aim exists, it is clearly either a terrorist group or, more likely, the cyber force of one of the countries that maintain this kind of security apparatus.
The fact that year after year more and more malicious software appears on the Internet, and that destructive attacks are becoming more frequent, supports the thesis that in the foreseeable future a hypothetical scenario is possible in which a militant group decides to attempt to bring down the Internet in its entirety. Of course, the questions remain what its motives would be and by what means and techniques the attempt would be carried out, but it is obvious that the probability of such an event is growing, so it is not illogical to assume that there will also be some kind of global test of the Internet in the future, which could be carried out at the level of the physical and logical infrastructure.
For now, there are a number of ways to verify the Internet as a global system using simulation models, but their characteristics and verification methods are largely kept quiet, mostly for security reasons, since confirmed successful scenarios are nothing other than possible attack plans. Besides simplifying the infrastructure they observe, these models require considerable processing power, and the results obtained from them are not the most reliable.
Another way to check the resilience of the global network is to test a scaled-down replica, but such tests also have a lower degree of complexity and ignore the synergetic effects that grow with size.
In general it can be said that tests of real resilience are an indispensable kind of check for a global infrastructure such as the Internet, but also that, because of its size and complexity, they are not possible in reality; we therefore resort to testing models, which give only a vague and approximate picture of how the global network behaves in conditions close to collapse.
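A toy version of such a model (emphatically not a model of the real Internet: the topology, sizes and removal pattern are arbitrary assumptions, and it relies on the third-party networkx package) can still convey what this kind of simulation measures, namely how much of the network stays connected as parts of it are removed:

```python
import random
import networkx as nx

# Stand-in topology: a small scale-free-like graph instead of the real Internet.
G = nx.barabasi_albert_graph(n=2000, m=2, seed=42)
nodes = list(G.nodes)

for fraction in (0.05, 0.20, 0.40, 0.60):
    H = G.copy()
    H.remove_nodes_from(random.sample(nodes, int(fraction * len(nodes))))
    largest = max(nx.connected_components(H), key=len) if H.number_of_nodes() else set()
    print(f"removed {fraction:.0%} of nodes -> largest connected component "
          f"covers {len(largest) / len(nodes):.0%} of the original network")
```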
Although extremely significant, it is to be hoped that the data gathered by this kind of testing will never be needed in reality, and that the consolidated and evaluated models will remain merely security protocols we never have to use, i.e. that we will never have to rely on this data to recover the global network from a disaster of such proportions.

Wednesday, December 14, 2016

MES and Industry 4.0



Over the past two years there has been quite a lot of talk about the concept of Industry 4.0, which is slowly growing from a political project of the German government into a widely accepted standard that industry aspires to. The basic idea of the concept is the division of industrial capacities into layers, so that ICT technologies can be applied to them more easily and directly, in particular through appropriate software solutions for each layer, and through solving the problem of their interconnectivity and the free transfer of information between the layers, as well as of direct requests from management, maintenance or technical supervision.
Basic principles of industry 4.0:
• Interoperability
• Virtualization
• Decentralisation
• Service orientation
• Modularity
• Ability to access information in real time
Industry 4.0 as a concept is based on cyber-physical systems and their interactions, as well as on IoT (Internet of Things) technologies and Internet services.
In addition to the now ubiquitous software for managing and controlling company resources, such as ERP (Enterprise Resource Planning) software and CAD/CAM software (computer-aided design and computer-aided manufacturing), industrial plants that aim for rapid implementation of the Industry 4.0 concept need another, intermediate layer of software solutions linking the two, usually abbreviated as MES.
MES (Manufacturing Execution System) is in practice a system that connects the third-level operational system (software typically classified as ERP) with the control and monitoring of production and manufacturing operations in real time.
MES software functionality:
• Optimizing daily operations
• Importing data from CAD/CAM software
• Tracking parts through the manufacturing process (illustrated in the sketch below)
• Informing all interested parties that a part has been damaged (and offering an answer to the question why)
• Offering a clear answer as to when a given part of the job will be done
• Producing time plans and financial reports
The benefits of MES systems are: fewer delays in manufacturing, increased productivity, higher efficiency, and full integration, meaning the entire business runs smoothly as a single operation.
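As a purely illustrative sketch (the classes, stations and serial number below are invented, not taken from any particular MES product), part tracking of the kind listed above often comes down to recording time-stamped events per part and deriving its status from them:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class PartEvent:
    station: str                  # e.g. "cutting", "assembly", "quality_check"
    status: str                   # e.g. "ok" or "damaged"
    timestamp: datetime = field(default_factory=datetime.now)

@dataclass
class Part:
    serial: str
    events: List[PartEvent] = field(default_factory=list)

    def record(self, station: str, status: str = "ok") -> None:
        self.events.append(PartEvent(station, status))

    def is_damaged(self) -> bool:
        return any(e.status == "damaged" for e in self.events)

# Follow one (hypothetical) part through the line and flag the damage.
part = Part(serial="HYP-000123")
part.record("cutting")
part.record("assembly", status="damaged")   # this is what would trigger the "why?" report
print(part.serial, "damaged:", part.is_damaged())
```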
A special advantage of an MES system lies in finding and understanding delays in production, and in resolving them, which increases overall equipment effectiveness (OEE). This is especially important because, with all processes visible, it is possible to calculate the effects of individual delays and bottlenecks and to evaluate the financial effect of removing them with any given technical solution, so that the return-on-investment time is known in advance and business risk is reduced.
Overall equipment effectiveness is a multi-dimensional indicator of efficiency, i.e. a common factor to which all other performance indicators from the various subsystems are reduced so that they can be compared. With this indicator it is possible to compare efficiency expressed in time, fuel, materials and energy, as well as other efficiency measures of individual systems. Several factors determine OEE: availability, performance and quality. Regardless of which factor is dominant in a particular case, or of the methodology applied to measure efficiency, OEE and MES systems are applicable to any manufacturing company.
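A brief worked example, using the commonly cited decomposition OEE = availability × performance × quality (the shift figures below are invented for illustration):

```python
def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    """Overall equipment effectiveness as the product of its three factors."""
    availability = run_time / planned_time                      # share of planned time actually run
    performance = (ideal_cycle_time * total_count) / run_time   # actual speed vs. ideal speed
    quality = good_count / total_count                          # share of good parts
    return availability * performance * quality

# Hypothetical 8-hour shift: 60 min of downtime, one part every 30 s at ideal speed,
# 800 parts produced, 780 of them good.
print(f"OEE = {oee(480 * 60, 420 * 60, 30, 800, 780):.1%}")    # roughly 81%
```

Because each factor is dimensionless, losses expressed in time, speed and yield become directly comparable, which is exactly the point made above.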
When selecting and implementing an MES solution it is necessary to follow the best recommendations arising from industrial practice. This applies in particular to the fact that the core implementation team must be composed of people with expert knowledge of all parts of the system, especially of key areas such as operations, IT, and the equipment used (electrical, mechanical, pneumatics, hydraulics, statics and dynamics of the system), as well as people responsible for the supply chain and quality. It is recommended that teams, depending on the complexity of the task, consist of no fewer than 3 to 6 experts. Another important recommendation is that the system analysts be time-limited in giving their recommendations, because only a time limit prevents them from examining the system at too fine a granularity, i.e. at a higher level of complexity than is needed for successful implementation of the MES solution. It should also be kept in mind that MES projects are mostly long-term, lasting at least half an economic cycle (4-5 years, depending on the complexity and the budget of the company), and that, due to their complexity, they cannot be conducted on the basis of information obtained from predefined generic questionnaires: the choice of system depends directly on the work flows and functional requirements, and a complete functional specification of the processes, materials, information and energy is the basic input for the design and selection of the system.
It is also important to note that, despite the traditional view that such advanced IT systems are expensive and that their implementation requires a great deal of time, in practice situations often arise in which ad hoc solutions achieve significant effects. There are known examples of implementations of individual MES solutions in well-structured and regulated production systems in which some (vital) MES functionality was delivered within 3-4 days, starting with a budget of $10,000.
It is worth noting that when selecting an MES solution one should bear in mind that the predefined solutions that come as part of an ERP package (this is particularly true of SAP) are nothing more than the corresponding batch procedures, and that as such they do not offer real-time data but data refreshed at a dynamics chosen in advance.
Without going into the technical details of selecting an MES solution, or into the elements of its interconnectivity with other solutions, it is obvious that by applying such a solution we arrive at full visibility of the production line and a clear classification of the events on it.
Every project involving system integration requires cross-domain skills of engineers and managers, and must be initiated with a clear view of the objective we want to achieve with the system design, working backwards from it to the necessary conditions, resources and design solutions.
However, whatever methodology a manufacturer follows in selecting and implementing an MES solution, the fact remains that implementing such a solution greatly accelerates the business and makes it sufficiently transparent to the technical staff and managers within the production system. In today's business environment, where quick answers to new challenges and threats are expected, such a system provides an undeniable competitive advantage, so it is logical to assume that the trend of accelerated introduction of MES systems will continue in the future.