<?xml version="1.0" encoding="ISO-8859-1"?><article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<front>
<journal-meta>
<journal-id>0012-7353</journal-id>
<journal-title><![CDATA[DYNA]]></journal-title>
<abbrev-journal-title><![CDATA[Dyna rev.fac.nac.minas]]></abbrev-journal-title>
<issn>0012-7353</issn>
<publisher>
<publisher-name><![CDATA[Universidad Nacional de Colombia]]></publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id>S0012-73532010000300026</article-id>
<title-group>
<article-title xml:lang="en"><![CDATA[ANALYSIS AND REVIEW OF THE CONTRIBUTION OF NEURAL NETWORKS TO SAVING ELECTRICITY IN RESIDENTIAL LIGHTING BY A DESIGN IN MATLAB]]></article-title>
<article-title xml:lang="es"><![CDATA[ANÁLISIS Y ESTUDIO DE LA CONTRIBUCIÓN DE LAS REDES NEURONALES AL AHORRO DE ENERGÍA ELÉCTRICA EN ILUMINACIÓN RESIDENCIAL MEDIANTE UN DISEÑO EN MATLAB]]></article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname><![CDATA[ROMERO]]></surname>
<given-names><![CDATA[RICARDO]]></given-names>
</name>
<xref ref-type="aff" rid="A01"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname><![CDATA[GIRAL]]></surname>
<given-names><![CDATA[DIEGO]]></given-names>
</name>
<xref ref-type="aff" rid="A02"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname><![CDATA[HERNANDEZ]]></surname>
<given-names><![CDATA[CESAR]]></given-names>
</name>
<xref ref-type="aff" rid="A03"/>
</contrib>
</contrib-group>
<aff id="A01">
<institution><![CDATA[Universidad Distrital Francisco José de Caldas]]></institution>
<addr-line><![CDATA[ ]]></addr-line>
</aff>
<aff id="A02">
<institution><![CDATA[Universidad Distrital Francisco José de Caldas]]></institution>
<addr-line><![CDATA[ ]]></addr-line>
</aff>
<aff id="A03">
<institution><![CDATA[Universidad Distrital Francisco José de Caldas]]></institution>
<addr-line><![CDATA[ ]]></addr-line>
</aff>
<pub-date pub-type="pub">
<day>00</day>
<month>09</month>
<year>2010</year>
</pub-date>
<pub-date pub-type="epub">
<day>00</day>
<month>09</month>
<year>2010</year>
</pub-date>
<volume>77</volume>
<numero>163</numero>
<fpage>248</fpage>
<lpage>259</lpage>
<copyright-statement/>
<copyright-year/>
<self-uri xlink:href="http://www.scielo.org.co/scielo.php?script=sci_arttext&amp;pid=S0012-73532010000300026&amp;lng=en&amp;nrm=iso"></self-uri><self-uri xlink:href="http://www.scielo.org.co/scielo.php?script=sci_abstract&amp;pid=S0012-73532010000300026&amp;lng=en&amp;nrm=iso"></self-uri><self-uri xlink:href="http://www.scielo.org.co/scielo.php?script=sci_pdf&amp;pid=S0012-73532010000300026&amp;lng=en&amp;nrm=iso"></self-uri><abstract abstract-type="short" xml:lang="en"><p><![CDATA[This paper presents the implementation of time-schedule programming as a lighting-control method, used to achieve a total saving and a personalized saving by means of neural networks. From data gathered on the operation of five lights located in different parts of a specific house, a neural network was designed for one light and the design was then applied to the remaining ones. These neural networks were trained with the input vectors (hour of the day, day of the week, and holiday Mondays) and their respective target vectors (total saving and personalized saving), in order to evaluate the performance of neural networks in optimizing methods for saving electric energy in residential lighting.]]></p></abstract>
<abstract abstract-type="short" xml:lang="es"><p><![CDATA[En este documento se presenta la implementación de la programación horaria como método de control de iluminación, para realizar un ahorro total y un ahorro personalizado, utilizando redes neuronales. Con la adquisición de una serie de datos sobre el funcionamiento de 5 luminarias ubicadas en diferentes partes de una casa específica, se diseñó una red neuronal para una luminaria y se implementó este diseño para las restantes. Estas redes neuronales fueron entrenadas con los vectores de entrada (hora del día, día de la semana, lunes festivos) y sus respectivos vectores objetivo (ahorro total y ahorro personalizado), con el fin de evaluar el desempeño de las redes neuronales en la optimización de métodos para el ahorro de energía eléctrica en iluminación residencial.]]></p></abstract>
<kwd-group>
<kwd lng="en"><![CDATA[Neural network]]></kwd>
<kwd lng="en"><![CDATA[time schedules]]></kwd>
<kwd lng="en"><![CDATA[total saving]]></kwd>
<kwd lng="en"><![CDATA[personalized saving]]></kwd>
<kwd lng="en"><![CDATA[lighting control]]></kwd>
<kwd lng="es"><![CDATA[Red neuronal]]></kwd>
<kwd lng="es"><![CDATA[programación horaria]]></kwd>
<kwd lng="es"><![CDATA[ahorro total]]></kwd>
<kwd lng="es"><![CDATA[ahorro personalizado]]></kwd>
<kwd lng="es"><![CDATA[control de iluminación]]></kwd>
</kwd-group>
</article-meta>
</front><body><![CDATA[ <p align="center"><font size="4" face="Verdana, Arial, Helvetica, sans-serif"><b>ANALYSIS AND REVIEW OF THE CONTRIBUTION OF NEURAL NETWORKS TO SAVING ELECTRICITY IN RESIDENTIAL LIGHTING BY A DESIGN IN MATLAB</b></font></p>     <p align="center"><i><font size="3" face="Verdana, Arial, Helvetica, sans-serif"><b>AN&Aacute;LISIS Y ESTUDIO DE LA CONTRIBUCI&Oacute;N DE LAS REDES NEURONALES AL AHORRO DE ENERG&Iacute;A EL&Eacute;CTRICA EN ILUMINACI&Oacute;N RESIDENCIAL MEDIANTE UN DISEÑO EN MATLAB</b></font></i></p>     <p align="center">&nbsp;</p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>RICARDO ROMERO</b>    <br>   <i>Technician in Electricity, Universidad Distrital Francisco Jos&eacute; de Caldas, Electrical Engineering Student, <a href="mailto:ricardo.romero.romero@gmail.com">ricardo.romero.romero@gmail.com</a></i></font></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>DIEGO GIRAL</b>    <br>   <i>Technician in Electricity, Universidad Distrital Francisco Jos&eacute; de Caldas, Electrical Engineering Student, <a href="mailto:diego_giral@yahoo.com">diego_giral@yahoo.com</a></i></font></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"> <b>CESAR HERNANDEZ</b>    <br>   <i>Electronic Engineer, Universidad Distrital Francisco Jos&eacute; de Caldas, Professor, <a href="mailto:cahernandezs@udistrital.edu.co">cahernandezs@udistrital.edu.co</a></i></font></p>     <p align="center">&nbsp;</p>     ]]></body>
<body><![CDATA[<p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>Received for review February 10<sup>th</sup>, 2010, accepted June 16<sup>th</sup>, 2010, final version June 18<sup>th</sup>, 2010</b></font></p>     <p>&nbsp;</p> <hr>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>ABSTRACT:</b> This paper presents the implementation of time-schedule programming as a lighting-control method, used to achieve a total saving and a personalized saving by means of neural networks. From data gathered on the operation of five lights located in different parts of a specific house, a neural network was designed for one light and the design was then applied to the remaining ones. These neural networks were trained with the input vectors (hour of the day, day of the week, and holiday Mondays) and their respective target vectors (total saving and personalized saving), in order to evaluate the performance of neural networks in optimizing methods for saving electric energy in residential lighting.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>KEYWORDS:</b> Neural network, time schedules, total saving, personalized saving, lighting control.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>RESUMEN: </b>En este documento se presenta la implementaci&oacute;n de la programaci&oacute;n horaria como m&eacute;todo de control de iluminaci&oacute;n, para realizar un ahorro total y un ahorro personalizado, utilizando redes neuronales. Con la adquisici&oacute;n de una serie de datos sobre el funcionamiento de 5 luminarias ubicadas en diferentes partes de una casa espec&iacute;fica, se diseñ&oacute; una red neuronal para una luminaria y se implement&oacute; este diseño para las restantes. Estas redes neuronales fueron entrenadas con los vectores de entrada (hora del d&iacute;a, d&iacute;a de la semana, lunes festivos) y sus respectivos vectores objetivo (ahorro total y ahorro personalizado), con el fin de evaluar el desempeño de las redes neuronales en la optimizaci&oacute;n de m&eacute;todos para el ahorro de energ&iacute;a el&eacute;ctrica en iluminaci&oacute;n residencial.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>PALABRAS CLAVE:</b> Red neuronal, programaci&oacute;n horaria, ahorro total, ahorro personalizado, control de iluminaci&oacute;n.</font></p> <hr>     <p>&nbsp; </p>     <p><font size="3" face="Verdana, Arial, Helvetica, sans-serif"><b>1. INTRODUCTION</b></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Nowadays, increases in the cost of the electric energy service are very common, which makes people concerned about the energy consumption of their homes, caused by the many electric devices present in them. A large part of this consumption is due to basic household needs such as lighting and food refrigeration.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Taking current saving needs into account, it is necessary to consider the extra expenditure generated by an inadequate use of electric energy; the most frequent cause is leaving devices turned on when they are not being used, the most common example being lamps or light bulbs left on.</font></p>     ]]></body>
<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Therefore, the need arises to optimize and to develop new systems and/or methods that allow electric energy savings in homes through lighting control. Today there are various methods for controlling lighting, among which is schedule programming, where the turning on, turning off, and dimming of the lights can be programmed according to the time of day and the day of the week &#91;1&#93;.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">This paper evaluates the performance of neural networks in optimizing the operation of 5 lights in a specific house under a schedule-programming method. This method was implemented because neural networks are intelligent models that seek to reproduce the behavior of the brain and can adapt to almost any application &#91;2&#93;.</font></p>     <p>&nbsp;</p>     <p><font size="3" face="Verdana, Arial, Helvetica, sans-serif"><b>2. DESIGN AND METHODOLOGY</b></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">To design a neural network that allows predicting the operation of the lights, several factors must be taken into consideration:</font></p> <ul>       <li><font size="2" face="Verdana, Arial, Helvetica, sans-serif"> Target vectors and input vectors</font></li>       <li><font size="2" face="Verdana, Arial, Helvetica, sans-serif"> Data acquisition</font></li>       <li><font size="2" face="Verdana, Arial, Helvetica, sans-serif"> Structure of the neural network</font></li>     </ul>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>2.1 Target Vectors and Input Vectors    ]]></body>
<body><![CDATA[<br>   </b></font><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Selecting the target vectors and the input vectors is the first step in the design of a neural network applied to data prediction.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><i>2.1.1 Target Vectors    <br>   </i>The objective of this work is to evaluate the performance of neural networks in optimizing the schedule-programming method for energy saving, through a total saving and a personalized saving, which makes it necessary to define two target vectors for each light. <a href="#tab01">Table 1</a> describes the numeric values that were assigned according to the characteristics of each variable:</font></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="tab01"></a>Table 1.</b> Outputs for the schedule-programming control</font>    <br>   <img src="/img/revistas/dyna/v77n163/a26tab01.gif"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Total saving: determined by two states, the light being on or off, which facilitates the assignment of binary values: "0" represents off and "1" represents on.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Personalized saving: made up of the 5 dimming levels desired for the house lighting (0%, 25%, 50%, 75% and 100%), therefore represented as decimal values from 0 to 1 in steps of 0.25 (Equation 1 was used to assign the decimal values to the dimming percentages).</font></p>     <p><img src="/img/revistas/dyna/v77n163/a26eq01.gif"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">where Y is the numeric value of the desired dimming level and n can take values from 0 to 100 in steps of 25.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><i>2.1.2 Input Vectors    ]]></body>
<body><![CDATA[<br>   </i></font><font size="2" face="Verdana, Arial, Helvetica, sans-serif">It is indispensable to establish input factors that generate patterns related to the output values during the training of the neural network, so that it can determine the electric energy consumption of household lighting.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">On the other hand, unforeseen environmental conditions have to be taken into consideration, such as alternation of the fixed situations or modification of the established times; such changes can be defined as unpredictable factors and require a filtering process so that they do not noticeably distort the model.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>Factors that affect the outputs of the network    <br>   </b></font><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The schedule-programming method uses the hour of the day and the day of the week to determine the operating behavior of the lights. Therefore the factors hour, day, and holiday Monday were selected as the input data for the simulation of the neural network. The selected variables are explained below.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"> a. Hour of the day: This is one of the variables with the greatest influence on the use of household lighting, since people evidently need to turn the lights on or off according to the hour of the day. To introduce this factor, and to give greater precision to the operating times of the lights, a numeric variable is used as an input for the neural network; its range runs from 0 to 23.75 in steps of 0.25, each step representing an interval of 15 minutes (Equation 2).</font></p>     <p><img src="/img/revistas/dyna/v77n163/a26eq02.gif"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">where Y is the numeric value of the hour of the day and n can take values from 0 to 60 in intervals of 15 minutes, each equivalent to ¼ of an hour.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><a href="#tab02">Table 2</a> shows an example of the numeric values used to represent the 15-minute intervals.</font></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="tab02"></a>Table 2.</b> Example of the conversion of the hour of the day into numeric variables</font>    <br>   <img src="/img/revistas/dyna/v77n163/a26tab02.gif"></p>     ]]></body>
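The two encodings above can be sketched in code. Since Equations 1 and 2 appear only as images, their exact form is assumed here to be Y = n/100 and Y = hour + n/60 respectively; this is a hedged Python reconstruction, not the paper's Matlab implementation:

```python
# Assumed forms of Equations 1 and 2 (a sketch, not the paper's code).

def personalized_target(n_percent):
    """Equation 1 (assumed): map a dimming percentage, n in
    (0, 25, 50, 75, 100), to a decimal target between 0 and 1."""
    assert n_percent in (0, 25, 50, 75, 100)
    return n_percent / 100.0

def hour_value(hour, n_minutes):
    """Equation 2 (assumed): map the hour of the day plus a quarter-hour
    minute value, n in (0, 15, 30, 45), to the numeric input, which then
    runs from 0 to 23.75 in steps of 0.25."""
    assert n_minutes in (0, 15, 30, 45)
    return hour + n_minutes / 60.0
```

For example, 75% dimming encodes as 0.75 and a quarter past two in the afternoon encodes as 14.25.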
<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"> b. Day of the week: The day of the week affects the operation of the lights; a very clear example is comparing a Sunday with a Tuesday: since Sunday is a non-working day, it changes people's routine behavior and therefore the use of appliances and lights. This factor is codified as shown in <a href="#tab03">Table 3</a>.</font></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="tab03"></a>Table 3.</b> Day of the week and its corresponding numeric value</font>    <br>   <img src="/img/revistas/dyna/v77n163/a26tab03.gif"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">c. Holiday Mondays: there is clearly a great difference between a working Monday and a holiday Monday which, as mentioned before, affects the usage habits of the lights. This variable is included in the input vector as a binary numeric value: "1" represents a holiday Monday and "0" a working Monday.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Finally, the schematic of the input and target vectors is designed, as shown in <a href="#fig01">figure 1</a>.</font></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="fig01"></a><img src="/img/revistas/dyna/v77n163/a26fig01.gif">    <br>   Figure 1.</b> General schematic of the inputs and targets of the neural network to be designed</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>2.2 Data acquisition    <br>   </b>Once the input values and the target vectors were determined, data gathering began on the operation of the 5 lights, located in different parts of a specific house, over 11 weeks. The information was captured through a study of the schedules established by the residents' routines, and with the collaboration of the members of the household. <a href="#tab04">Table 4</a> shows an example of how the data was gathered. In total 295,680 data points were collected, of which about 20% were set aside for validation and the remaining 80% were used to train the network, as shown in <a href="#tab05">Table 5</a>.</font></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="tab04"></a>Table 4.</b> Example of the data gathered for the light located in the dining room, from 3 PM until Saturday 4 PM</font>    ]]></body>
<body><![CDATA[<br>   <img src="/img/revistas/dyna/v77n163/a26tab04.gif"></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="tab05"></a>Table 5.</b> Distribution of the data for training and validation of the network</font>    <br>   <img src="/img/revistas/dyna/v77n163/a26tab05.gif"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>2.3 Structure of the neural network    <br>   </b></font><font size="2" face="Verdana, Arial, Helvetica, sans-serif">At this point the design of the neural network begins, using Matlab's Neural Network toolbox (nntool); this simulation tool allows the selection of the network type, training function, number of layers, number of neurons, and transfer function per layer, taking into account the input vectors and target vectors. <a href="#fig01">Figure 1</a> presents the general schematic of the neural network.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><i>2.3.1 Type of neural network    <br>   </i></font><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The following parameters were considered for the selection of the network type: type of training, desired targets, number of hidden layers, and the processing capacity needed to train the network. This last aspect can cause time-optimization problems in the iterations and can block the equipment used, which had the following characteristics: Core 2 Duo CPU T5550 at 1.8 GHz with 2 GB RAM and a 2.046 GB paging file.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">After making a comparative chart of the 3 different types of neural networks used for prediction, shown in <a href="#tab06">Table 6</a>, the choice of the Feed-Forward Backprop network became evident. This type of network is one of the most used for pattern prediction today because, when it emits a result, the result is compared with the desired output and the error is calculated. The output layer then propagates the error back towards the hidden layers, recalculating the weights so that the error is minimized in the next iteration. As the network is trained, the neurons of the intermediate layers learn to recognize the characteristics of the inputs (input patterns) &#91;2&#93; &#91;3&#93; &#91;4&#93;.</font></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="tab06"></a>Table 6.</b> Restrictive parameters for selecting the type of network to be used</font>    <br>   <img src="/img/revistas/dyna/v77n163/a26tab06.gif"></p>     ]]></body>
<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><i>2.3.2 Feed-Forward Backprop network parameters    <br>   </i></font><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Once the Feed-Forward Backprop network is selected, the Neural Network toolbox provides a series of algorithms and parameters that vary according to the application intended for the neural network. The parameters are:</font></p> <ul>       <li> <font size="2" face="Verdana, Arial, Helvetica, sans-serif">Input ranges</font></li>       <li> <font size="2" face="Verdana, Arial, Helvetica, sans-serif">Training function</font></li>       <li> <font size="2" face="Verdana, Arial, Helvetica, sans-serif">Adaption learning function</font></li>       <li> <font size="2" face="Verdana, Arial, Helvetica, sans-serif">Performance function</font></li>       <li> <font size="2" face="Verdana, Arial, Helvetica, sans-serif">Number of layers</font></li>     </ul>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">These options are observed in the Neural Network toolbox as shown in <a href="#fig02">figure 2</a>. The effect of each of these items on the design of the network is explained below.</font></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="fig02"></a><img src="/img/revistas/dyna/v77n163/a26fig02.gif">    ]]></body>
<body><![CDATA[<br>   Figure 2.</b> Parameter selection window of the Neural Network toolbox</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">a. Input ranges    <br>   </font><font size="2" face="Verdana, Arial, Helvetica, sans-serif">This item is in fact not a parameter but a box that allows loading the input vector, from which the numeric ranges are obtained.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">b. Training function    <br>   </font><font size="2" face="Verdana, Arial, Helvetica, sans-serif">This parameter allows choosing the type of training algorithm used by the neural network. According to the research done on prediction and on applications to practical problems, the use of 4 training functions is recommended &#91;5&#93;&#91;6&#93;:</font></p> <ul>       <li><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Trainlm: (the default) requires large storage capacity and converges in a low number of iterations; an algorithm that updates the weights and biases according to Levenberg-Marquardt optimization.</font></li>       <li><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Trainbfg: requires more storage capacity than the conjugate gradient algorithms but generally converges in fewer iterations; a quasi-Newton alternative whose mathematical expression is derived from Newton's method.</font></li>       <li><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Trainscg: requires less storage capacity and converges in fewer iterations; scaled conjugate gradient backpropagation training.</font></li>       <li><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Traingdx: requires less storage capacity but converges in a high number of iterations; gradient descent backpropagation training with momentum and adaptive learning rate.</font></li>     </ul>     ]]></body>
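As an illustration of the last training function above, the gradient-descent-with-momentum rule behind Traingdx (and the Learngdm adaptation function described later) can be sketched as follows; the learning rate, momentum constant, and toy loss are assumed values, and this is a minimal Python sketch rather than the paper's Matlab code:

```python
# Sketch of one gradient-descent-with-momentum weight update (assumed
# hyperparameters; lr mirrors the learning rate, mc the momentum constant).

def gdm_update(weight, gradient, velocity, lr=0.01, mc=0.9):
    """Return the new weight and velocity after one momentum step."""
    velocity = mc * velocity - lr * gradient   # accumulate momentum
    return weight + velocity, velocity

w, v = 1.0, 0.0
for _ in range(3):          # three illustrative iterations
    grad = 2.0 * w          # gradient of the toy loss w squared
    w, v = gdm_update(w, grad, v)
```

Each step moves the weight against the gradient while the momentum term smooths successive updates, which is what lets the adaptive-rate variant tolerate a larger learning rate.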
<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">For the design of the neural network, a series of tests was made to find the best training function; the best results can be found in <a href="#tab08">Table 8</a>.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">c. Learning adaptation    <br>   </font><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The Neural Network toolbox allows varying this parameter between 2 options:</font></p> <ul>       <li> <font size="2" face="Verdana, Arial, Helvetica, sans-serif">Learngd: learning by gradient descent &#91;6&#93;</font></li>       <li> <font size="2" face="Verdana, Arial, Helvetica, sans-serif">Learngdm: learning by gradient descent with momentum &#91;7&#93;.</font></li>     </ul>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Because it cannot be established a priori which of the two gives the better adaptation, tests were made to verify the optimal one, as shown in <a href="#tab08">Table 8</a>.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">d. Performance function    <br>   </font><font size="2" face="Verdana, Arial, Helvetica, sans-serif">As its name indicates, this function allows observing the performance of the neural network during its training, providing the training error.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Matlab gives the option of 3 performance functions:</font></p> <ul>       ]]></body>
<body><![CDATA[<li> <font size="2" face="Verdana, Arial, Helvetica, sans-serif">MSE: mean squared error performance function.</font></li>       <li> <font size="2" face="Verdana, Arial, Helvetica, sans-serif">MSEREG: mean squared error with regularization performance function.</font></li>       <li> <font size="2" face="Verdana, Arial, Helvetica, sans-serif">SSE: sum squared error performance function.</font></li>     </ul>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Since all 3 functions measure the training error, the parameter already established by default in Matlab's Neural Network toolbox, "MSE", was selected &#91;6&#93;.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">e. Number of layers and layer properties    <br>   </font><font size="2" face="Verdana, Arial, Helvetica, sans-serif">For a better design of the neural network, Matlab allows modifying:</font></p>     <p> <font size="2" face="Verdana, Arial, Helvetica, sans-serif">. Number of layers    <br>   For the selection of the number of layers, error tests were made with 1, 2, 3 and 4 layers; the best results are shown in <a href="#tab08">Table 8</a>. From the gathered data it was concluded that 3 layers give the best performance.</font></p>     <p> <font size="2" face="Verdana, Arial, Helvetica, sans-serif">. Number of neurons    <br>   As with the number of layers, this characteristic was obtained through tests; the main results are shown in <a href="#tab08">Table 8</a>, from which it can be observed that the number of neurons per layer that best adapts to the training of the network is &#91;20-30-2&#93;. It is worth mentioning that the objective of the network is to predict two outputs (total saving and personalized saving); therefore the number of output neurons is 2.</font></p>     <p> <font size="2" face="Verdana, Arial, Helvetica, sans-serif">. Transfer function per layer    <br>   Matlab provides 3 transfer functions per layer, described in <a href="#tab07">Table 7</a>; a series of tests had to be performed to determine the best model for the selection of the transfer function. The output layer was designed with a PURELIN transfer function, since this linear transfer function provides the desired outputs for the intermediate values between 0 and 1 required by the personalized saving (<a href="#fig03">Figure 3</a>); for the remaining layers the best result was:</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">First layer: <i>Tansig    <br>   </i></font><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Second layer:<i> Tansig    <br>   </i></font><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Third layer:<i> Purelin</i></font></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="tab07"></a>Table 7.</b> Transfer function per layer</font>    <br>   <img src="/img/revistas/dyna/v77n163/a26tab07.gif"></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="fig03"></a><img src="/img/revistas/dyna/v77n163/a26fig03.gif">    <br>   Figure 3.</b> Representation of the numeric values of the targets</font></p>     ]]></body>
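The selected structure, 3 inputs feeding layers of 20, 30 and 2 neurons with Tansig, Tansig and Purelin transfer functions, can be sketched as a forward pass; this is an illustrative Python sketch with random stand-in weights, not the trained Matlab network:

```python
# Sketch of the chosen 3-20-30-2 forward pass (random weights stand in
# for trained values; tansig is the hyperbolic tangent, purelin is linear).
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(20, 3)), np.zeros(20)   # layer 1: 20 tansig neurons
W2, b2 = rng.normal(size=(30, 20)), np.zeros(30)  # layer 2: 30 tansig neurons
W3, b3 = rng.normal(size=(2, 30)), np.zeros(2)    # layer 3: 2 purelin outputs

def forward(x):
    """x holds the three inputs (hour value, day code, holiday-Monday flag);
    returns the two outputs (total saving, personalized saving)."""
    a1 = np.tanh(W1 @ x + b1)   # tansig
    a2 = np.tanh(W2 @ a1 + b2)  # tansig
    return W3 @ a2 + b3         # purelin (identity)

y = forward(np.array([14.25, 3.0, 0.0]))
```

In the paper these choices were made through Matlab's nntool rather than written out as code.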
<body><![CDATA[<p>&nbsp;</p>     <p><font size="3" face="Verdana, Arial, Helvetica, sans-serif"><b>3. RESULTS</b></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>3.1 Neural network structure    <br>   </b></font><font size="2" face="Verdana, Arial, Helvetica, sans-serif">According to the objective of   the project a simulation of a neural network for the 5 lightings with their   respective exit vectors was tried (Total saving and personalized saving), but   due to the amount of memory required to perform this simulation it was not   possible its development. For that reason the decision to make a neural network for each lighting was taken obtaining a total of 5   networks. For the selection of parameters the tests were made with only one   light, given that the structure of the data is similar. </font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">With the help of Matlab's   toolbox was made a total of 281 simulations; with the goal of obtaining the   configuration with the lowest performance (highest adaptation).</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">In <a href="#tab09">Table 9</a> are   observed the obtained results in the 281 simulations made for the light located   in room number 3, where can be appreciated the 3 best results of the four   training algorithms with the parameter variation of the Feed-Forward Backprop   network that finally provide the configuration most adaptable to the data.</font></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="tab09"></a><img src="/img/revistas/dyna/v77n163/a26tab09.gif">    <br>   Table 9.</b> Performance of the network and absolute errors</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">In <a href="#fig04">Figure 4</a> is he final model obtained. 
The structure is composed of 3 layers: the first contains 20 neurons with a Tansig transfer function, the second is composed of 30 neurons with the same transfer function, and the output layer uses a Purelin transfer function; it is clear that the third layer was designed according to the objective vectors.</font></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="fig04"></a><img src="/img/revistas/dyna/v77n163/a26fig04.gif">    ]]></body>
<body><![CDATA[<br>   Figure 4.</b> Final design of the neural network</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>3.2 Simulation results    <br>   </b><a href="#tab09">Table 9</a> shows the performance given by Matlab (the mean squared error of the function) for each light, as well as the averages of the absolute errors for each objective value; the performance goal is 0 &#91;7&#93;.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><a href="#fig05">Figure 5</a> shows the training performed by the neural network for each of the lights; each graph starts with a high error that diminishes until it stabilizes. Along with the training, the validation performed by Matlab can be seen, which makes it possible to minimize the number of iterations; the performance value of each network is found in <a href="#tab09">Table 9</a>. </font></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="fig05"></a><img src="/img/revistas/dyna/v77n163/a26fig05.gif">    <br>   Figure 5.</b> Training graphs for the lights in the kitchen, dining room, study room, H (1) and H (3)</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><a href="#fig05">Figure 5(a)</a>: Training for the light located in the kitchen, at 101 iterations.    <br>   <a href="#fig05">Figure 5(b)</a>: Training for the light located in the dining room, at 117 iterations.    <br>   <a href="#fig05">Figure 5(c)</a>: Training for the light located in the study room, at 29 iterations.    <br>   <a href="#fig05">Figure 5(d)</a>: Training for the light located in room 1, at 31 iterations.    ]]></body>
<body><![CDATA[<br>   <a href="#fig05">Figure 5(e)</a>: Training for the light located in room 3, at 134 iterations. </font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>3.3 Analysis of the data obtained    <br>   </b><a href="#fig06">Figure 6</a> shows the percentage participation of the On and Off states in the total data gathered for each light over a period of 11 weeks. It must be noted that the On state comprises the 25%, 50%, 75% and 100% regulation levels, and the Off state the 0% level &#91;8&#93; &#91;9&#93;.</font></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="fig06"></a><img src="/img/revistas/dyna/v77n163/a26fig06.gif">    <br>   Figure 6.</b> Contribution of each state to the total data gathered for each light</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><a href="#fig06">Figure 6(a)</a>: Contribution of the states to the total data gathered for the light in the kitchen.    <br>   <a href="#fig06">Figure 6(b)</a>: Contribution of the states to the total data gathered for the light in the dining room.    <br>   <a href="#fig06">Figure 6(c)</a>: Contribution of the states to the total data gathered for the light in the study room.    <br>   <a href="#fig06">Figure 6(d)</a>: Contribution of the states to the total data gathered for the light in room 1.    <br>   <a href="#fig06">Figure 6(e)</a>: Contribution of the states to the total data gathered for the light in room 3.</font></p>     ]]></body>
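<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The percentage participation shown in Figure 6 can be computed with a few lines of code. This is a minimal Python sketch with invented sample readings, assuming each logged sample is one of the regulation levels 0, 25, 50, 75 or 100%.</font></p>

```python
import numpy as np

# Sketch of the Figure 6 computation: each sample is a regulation level
# (0, 25, 50, 75 or 100 %); Off is the 0 % level and On groups the rest.
# The sample data below are invented for illustration.

def state_participation(levels):
    levels = np.asarray(levels)
    on = float(np.mean(levels > 0) * 100)    # % of samples in the On state
    return {"On": on, "Off": 100.0 - on}

samples = [0, 0, 0, 25, 50, 0, 100, 75, 0, 0]    # 10 readings
print(state_participation(samples))  # {'On': 40.0, 'Off': 60.0}
```
]]></body>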
<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>3.4 Result analysis for total saving    <br>   </b></font><font size="2" face="Verdana, Arial, Helvetica, sans-serif">From <a href="#fig06">Figure 6</a> and the outputs of the simulations for total saving, it was observed that a great deal of the data lies at intermediate points of the range between "0" and "1". Taking into consideration that the 2 states that make up the total saving are On (1) and Off (0), a logical condition was implemented that rounds values close to zero down to "0" and, in the same way, values close to one up to "1". To find the optimal logic value it was necessary to develop a series of tests, calculating the cost of energy and the user satisfaction for 5 different conditions, or comparison limits, as shown in <a href="#tab11">Table 11</a>.</font></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="tab11"></a>Table 11.</b> Selection of the logic or limit condition</font>    <br>   <img src="/img/revistas/dyna/v77n163/a26tab11.gif"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">To link the error directly with dissatisfaction, a factor is determined that relates the user's preference to the performance of the network. This factor was fixed through an interview with the members of the house where the data was gathered, as shown in <a href="#tab10">Table 10</a>. 
This table shows the preference of each user, giving an average percentage value per state; according to it, users prefer (at 75%) that the network turn the lights off, but not that it turn them on.</font></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="tab10"></a>Table 10.</b> Preference Survey</font>    <br>   <img src="/img/revistas/dyna/v77n163/a26tab10.gif"></p> <font size="2" face="Verdana, Arial, Helvetica, sans-serif">According to these two factors, the absolute error for the network is given by equation 3.</font>     <p><img src="/img/revistas/dyna/v77n163/a26eq021.gif"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Where EabsOn and EabsOff are the average errors of the difference between the objective vector and the output of the network.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><a href="#tab12">Table 12</a> shows the final cost per light generated by using the neural networks on the total-saving output, together with the real price for each light, calculated from the gathered data. The total cost of energy, both real and as given by the neural network, is found in <a href="#tab13">Table 13</a>.</font></p>     ]]></body>
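<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The rounding condition and the preference weighting described above can be sketched as follows. The 0.5 comparison limit and the 75%/25% preference split are placeholders (the limits actually tested are in Table 11 and the surveyed preferences in Table 10), and the preference-weighted error is a plausible reading of equation 3, not a transcription of it.</font></p>

```python
import numpy as np

# Logical condition for the total saving: network outputs close to 0 are
# forced to Off (0) and outputs close to 1 to On (1), using a comparison
# limit. The limit of 0.5 and the preference weights are placeholders.

def apply_limit(outputs, limit=0.5):
    return (np.asarray(outputs, dtype=float) >= limit).astype(float)

def dissatisfaction(targets, outputs, pref_on=0.25, pref_off=0.75):
    """Preference-weighted absolute error (a plausible form of Eq. 3)."""
    t = np.asarray(targets, dtype=float)
    o = np.asarray(outputs, dtype=float)
    err = np.abs(t - o)
    on_mask = t == 1
    e_on = err[on_mask].mean() if on_mask.any() else 0.0
    e_off = err[~on_mask].mean() if (~on_mask).any() else 0.0
    return pref_on * e_on + pref_off * e_off

raw = [0.1, 0.8, 0.45, 0.95, 0.2]
decided = apply_limit(raw)
print(decided)  # [0. 1. 0. 1. 0.]
```
]]></body>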
<body><![CDATA[<p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="tab12"></a>Table 12.</b> Cost of energy, total saving, for 100 W lighting</font>    <br>   <img src="/img/revistas/dyna/v77n163/a26tab12.gif"></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="tab13"></a>Table 13.</b> Total Cost of Energy, Total Savings</font>    <br>   <img src="/img/revistas/dyna/v77n163/a26tab13.gif"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Finally, for the analysis of the total-saving data, <a href="#tab14">Table 14</a> shows the customer dissatisfaction, obtained by multiplying the absolute error by the average of the client's preferences with respect to the On and Off states of the lights, found in <a href="#tab11">Table 11</a>.</font></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="tab14"></a>Table 14.</b> Customer satisfaction for total savings</font>    <br>   <img src="/img/revistas/dyna/v77n163/a26tab14.gif"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>3.5 Result analysis for the personalized saving    <br>   </b></font><font size="2" face="Verdana, Arial, Helvetica, sans-serif">For the personalized saving, as for the total saving, a 0.2 ("20%") limit was defined; unlike the total saving, however, this limit was determined in order to filter the noise produced by the network, which implied a considerable consumption. A limit was also established at 100%, since it is not possible for a light to acquire a higher value; the data that exceeded it were approximated to that objective.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><a href="#tab15">Table 15</a> shows the final cost per light applied to the personalized saving.</font></p>     ]]></body>
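<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The two limits described for the personalized saving can be sketched as a simple post-processing step. This Python fragment is illustrative; only the 0.2 noise limit and the 100% cap come from the text, while the sample outputs are invented.</font></p>

```python
import numpy as np

# Sketch of the personalized-saving post-processing described above:
# outputs below the 0.2 (20 %) limit are treated as network noise and set
# to 0, and outputs above 1.0 (100 %) are capped at 1.0, since a light
# cannot be regulated beyond 100 %. The sample data are invented.

def clean_regulation(outputs, noise_limit=0.2):
    r = np.asarray(outputs, dtype=float)
    r = np.where(r < noise_limit, 0.0, r)   # filter low-level noise
    return np.minimum(r, 1.0)               # cap at 100 %

raw = [0.05, 0.25, 0.5, 1.3, 0.19, 0.75]
print(clean_regulation(raw))
```
]]></body>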
<body><![CDATA[<p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="tab15"></a>Table 15.</b> Cost of energy, personalized saving</font>    <br>   <img src="/img/revistas/dyna/v77n163/a26tab15.gif"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The total cost of energy of the 5 lights under regulation, both real and as given by the neural network, is found in <a href="#tab16">Table 16</a>.</font></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="tab16"></a>Table 16.</b> Total cost of energy, personalized saving</font>    <br>   <img src="/img/revistas/dyna/v77n163/a26tab16.gif"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><a href="#tab17">Table 17</a> is composed of 5 columns and shows the customer dissatisfaction, determined by multiplying the absolute error by the customer's average preference percentages for each state, seen in <a href="#tab10">Table 10</a>. Finally, the customer satisfaction data applied to the personalized saving are presented. </font></p>     <p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="tab17"></a>Table 17.</b> Client satisfaction for personalized saving</font>    <br>   <img src="/img/revistas/dyna/v77n163/a26tab17.gif"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>3.6 Saving methods comparison    <br>   </b></font><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><a href="#tab18">Table 18</a> compares the "total saving" and "personalized saving" criteria of the scheduling programming as electric energy saving strategies in lighting.</font></p>     ]]></body>
<body><![CDATA[<p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b><a name="tab18"></a>Table 18.</b> Comparison of Total Saving and Personalized Saving</font>    <br>   <img src="/img/revistas/dyna/v77n163/a26tab18.gif"></p>     <p>&nbsp;</p>     <p><font size="3" face="Verdana, Arial, Helvetica, sans-serif"><b>4. CONCLUSIONS</b></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Since it was not possible to perform the simulation for all the lights at once, a neural network was designed for one light through a series of trial-and-error tests and then implemented for the remaining four lights, obtaining an average performance of 0.0303577.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">A scheduling programming method using neural networks was implemented, obtaining a satisfaction of 87.2% in total saving and of 88.5% in personalized saving, which generated, in the established time period, energy costs of $57.886,3 for total saving and of $53.142,2 for personalized saving.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">By comparing the total-saving and personalized-saving criteria with respect to the network outputs, it is determined that the total saving yielded a customer satisfaction of 87.2%, compared with 88.5% for the personalized saving, the satisfaction of the personalized saving thus being 1.3% higher.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">In the same way, the comparison of the consumed-energy costs with respect to the real values yields a saving of $13.503,9 for the total saving and of $8.009,3 for the personalized saving, obtaining a higher economic benefit of $4.744,1 in the personalized saving. 
</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The outputs of the developed neural network present unnecessary costs at the moments in which the light should be turned Off. For the total saving an expenditure of 17.138,3678 appears in the Off state, and for the personalized saving one of 19.779,24 when the regulation should be zero per cent.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Even though the output data of the neural network represent a saving, it is important to take into consideration that the neural network is not providing the consumption required to supply the basic needs of residential lighting.</font></p>     ]]></body>
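<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The cost and satisfaction comparison in the conclusions can be checked with simple arithmetic; the figures below are taken directly from the text (Colombian pesos, with the text's comma decimal separator converted, e.g. $57.886,3 &#8594; 57886.3).</font></p>

```python
# Figures quoted in the conclusions.
total_cost = 57886.3          # energy cost under the total-saving scheme
personalized_cost = 53142.2   # energy cost under the personalized scheme
total_satisfaction = 87.2         # %
personalized_satisfaction = 88.5  # %

# Personalized saving costs less and satisfies more:
extra_benefit = round(total_cost - personalized_cost, 1)
extra_satisfaction = round(personalized_satisfaction - total_satisfaction, 1)
print(extra_benefit, extra_satisfaction)  # 4744.1 1.3
```
]]></body>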
<body><![CDATA[<p>&nbsp;</p>     <p><font size="3" face="Verdana, Arial, Helvetica, sans-serif"><b>REFERENCES</b></font></p>     <!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>&#91;1&#93;</b> Casa domo. "El portal del hogar digital" &#91;Online&#93;. &#91;ref. August 4, 2008&#93;. Available: <a href="http://www.casadomo.com/noticiasDetalle.aspx?c=145&idm=157&m=21&n2=20&pat=20" target="referencia">http://www.casadomo.com/noticiasDetalle.aspx?c=145&amp;idm=157&amp;m=21&amp;n2=20&amp;pat=20</a>.<!-- end-ref --><!-- ref --><br>   <b>&#91;2&#93;</b> B. MARTIN DEL BRIO, A. SAENZ MOLINA, Redes Neuronales y Sistemas Difusos. Alfaomega, Mexico, 2002, pp. XXI, 3, 69.<!-- end-ref --><!-- ref --><br>   <b>&#91;3&#93;</b> P. L. GALINDO. "Redes multicapa: Algoritmo BackPropagation". &#91;Online&#93;. 1999. &#91;ref. December 5, 2008&#93;. Available: <a href="http://www2.uca.es/dept/leng_sist_informaticos/preal/23041/transpas/EBackpropagation/ppframe.htm" target="referencia">http://www2.uca.es/dept/leng_sist_informaticos/preal/23041/transpas/EBackpropagation/ppframe.htm</a>, slide 6.     
<!-- end-ref --><!-- ref --><br>   <b>&#91;4&#93;</b> Chapter 4: Redes Neuronales en la predicci&oacute;n. &#91;Online&#93;. &#91;ref. February 3, 2009&#93;. Available: <a href="http://catarina.udlap.mx/u_dl_a/tales/documentos/msp/aguilar_d_ra/capitulo4.pdf" target="referencia">http://catarina.udlap.mx/u_dl_a/tales/documentos/msp/aguilar_d_ra/capitulo4.pdf</a>.<!-- end-ref --><!-- ref --><br>   <b>&#91;5&#93;</b> "Mecanismos de Adaptaci&oacute;n de par&aacute;metros". &#91;Online&#93;. &#91;ref. March 18, 2009&#93;. Available: <a href="http://omarsanchez.net/adaptparam.Aspx" target="referencia">http://omarsanchez.net/adaptparam.Aspx</a>.<!-- end-ref --><!-- ref --><br>   <b>&#91;6&#93;</b> M. C. R&Iacute;OS, N. C. HERN&Aacute;NDEZ, M. M. CHANCHAN. "Evaluaci&oacute;n de los diferentes algoritmos de entrenamiento de redes neuronales artificiales para el problema de clasificaci&oacute;n vehicular". &#91;Online&#93;. Mexico. &#91;ref. June 23, 2009&#93;. Available: <a href="http://yalma.fime.uanl.mx/~pisis/Verano/2006/talk-norma.pdf" target="referencia">http://yalma.fime.uanl.mx/~pisis/Verano/2006/talk-norma.pdf</a>.   
<!-- end-ref --><!-- ref --><br>   <b>&#91;7&#93;</b> Neural Network Toolbox 6. &#91;Online&#93;. &#91;ref. March 6, 2008&#93;. Available: <a href="http://www.mathworks.com/access/helpdesk/help/pdf_doc/nnet/nnet.pdf" target="referencia">http://www.mathworks.com/access/helpdesk/help/pdf_doc/nnet/nnet.pdf</a>.<!-- end-ref --><!-- ref --><br>   <b>&#91;8&#93;</b> I. Richardson, M. Thomson, D. Infield, A. Delahunty, "Domestic lighting: A high-resolution energy demand model". &#91;Online&#93;. England. 2008. &#91;ref. March 23, 2009&#93;. 
Available: <a href="http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6V2V-4VSB18H-5&_user=10&_coverDate=07/31/2009&_alid=1373133729&_rdoc=12&_fmt=high&_orig=search&_cdi=5712&_sort=r&_docanchor=&view=c&_ct=4973&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=bcba6dbfbfd6ef242fd2a8348e7c53ef" target="referencia">http://www.sciencedirect.com/science?_ob=ArticleURL&amp;_udi=B6V2V-4VSB18H-5&amp;_user=10&amp;_coverDate=07%2F31%2F2009&amp;_alid=1373133729&amp;_rdoc=12&amp;_fmt=high&amp;_orig=search&amp;_cdi=5712&amp;_sort=r&amp;_docanchor=&amp;view=c&amp;_ct=4973&amp;_acct=C000050221&amp;_version=1&amp;_urlVersion=0&amp;_userid=10&amp;md5=bcba6dbfbfd6ef242fd2a8348e7c53ef</a>.<!-- end-ref --><!-- ref --><br>   <b>&#91;9&#93;</b> H.W. Li, K.L. Cheung, S.L. Wong, N.T. Lam. "An analysis of energy-efficient light fittings and lighting controls". &#91;Online&#93;. China, Research Group, City University of Hong Kong, Tat Chee Avenue. 2009. &#91;ref. August 23, 2009&#93;. Available: <a href="http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6V1T-4WW16YF-1&_user=10&_coverDate=02/28/2010&_alid=1373151754&_rdoc=24&_fmt=high&_orig=search&_cdi=5683&_sort=r&_docanchor=&view=c&_ct=4973&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=e84d9bb9acdf97cbd15714b13cc27301" target="referencia">http://www.sciencedirect.com/science?_ob=ArticleURL&amp;_udi=B6V1T-4WW16YF-1&amp;_user=10&amp;_coverDate=02%2F28%2F2010&amp;_alid=1373151754&amp;_rdoc=24&amp;_fmt=high&amp;_orig=search&amp;_cdi=5683&amp;_sort=r&amp;_docanchor=&amp;view=c&amp;_ct=4973&amp;_acct=C000050221&amp;_version=1&amp;_urlVersion=0&amp;_userid=10&amp;md5=e84d9bb9acdf97cbd15714b13cc27301</a>. 
</font><!-- end-ref --> ]]></body><back>
<ref-list>
<ref id="B1">
<label>1</label><nlm-citation citation-type="">
<source><![CDATA[Casa domo: El portal del hogar digital]]></source>
<year></year>
</nlm-citation>
</ref>
<ref id="B2">
<label>2</label><nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[MARTIN DEL BRIO]]></surname>
<given-names><![CDATA[B.]]></given-names>
</name>
<name>
<surname><![CDATA[SAENZ MOLINA]]></surname>
<given-names><![CDATA[A.]]></given-names>
</name>
</person-group>
<source><![CDATA[Redes Neuronales y Sistemas Difusos]]></source>
<year>2002</year>
<publisher-loc><![CDATA[Mexico ]]></publisher-loc>
<publisher-name><![CDATA[Alfaomega]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B3">
<label>3</label><nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[GALINDO]]></surname>
<given-names><![CDATA[P. L.]]></given-names>
</name>
</person-group>
<source><![CDATA[Redes multicapa Algoritmo BackPropagation]]></source>
<year>1999</year>
</nlm-citation>
</ref>
<ref id="B4">
<label>4</label><nlm-citation citation-type="">
<source><![CDATA[Chapter 4: Redes Neuronales en la predicción]]></source>
<year></year>
</nlm-citation>
</ref>
<ref id="B5">
<label>5</label><nlm-citation citation-type="">
<source><![CDATA[Mecanismos de Adaptación de parámetros]]></source>
<year></year>
</nlm-citation>
</ref>
<ref id="B6">
<label>6</label><nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[RÍOS]]></surname>
<given-names><![CDATA[M. C.]]></given-names>
</name>
<name>
<surname><![CDATA[HERNÁNDEZ]]></surname>
<given-names><![CDATA[N. C.]]></given-names>
</name>
<name>
<surname><![CDATA[CHANCHAN]]></surname>
<given-names><![CDATA[M. M.]]></given-names>
</name>
</person-group>
<source><![CDATA[Evaluación de los diferentes algoritmos de entrenamiento de redes neuronales artificiales para el problema de clasificación vehicular]]></source>
<year></year>
</nlm-citation>
</ref>
<ref id="B7">
<label>7</label><nlm-citation citation-type="">
<source><![CDATA[Neural Network ToolboxT 6]]></source>
<year></year>
</nlm-citation>
</ref>
<ref id="B8">
<label>8</label><nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Richardson]]></surname>
<given-names><![CDATA[I.]]></given-names>
</name>
<name>
<surname><![CDATA[Thomson]]></surname>
<given-names><![CDATA[M.]]></given-names>
</name>
<name>
<surname><![CDATA[Infield]]></surname>
<given-names><![CDATA[D.]]></given-names>
</name>
<name>
<surname><![CDATA[Delahunty]]></surname>
<given-names><![CDATA[A.]]></given-names>
</name>
</person-group>
<source><![CDATA[Domestic lighting: A high-resolution energy demand model]]></source>
<year></year>
</nlm-citation>
</ref>
<ref id="B9">
<label>9</label><nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Li]]></surname>
<given-names><![CDATA[H.W.]]></given-names>
</name>
<name>
<surname><![CDATA[Cheung]]></surname>
<given-names><![CDATA[K.L.]]></given-names>
</name>
<name>
<surname><![CDATA[Wong]]></surname>
<given-names><![CDATA[S.L.]]></given-names>
</name>
<name>
<surname><![CDATA[Lam]]></surname>
<given-names><![CDATA[N.T.]]></given-names>
</name>
</person-group>
<source><![CDATA[An analysis of energy-efficient light fittings and lighting controls]]></source>
<year>2009</year>
<publisher-name><![CDATA[Research Group, City University of Hong Kong]]></publisher-name>
</nlm-citation>
</ref>
</ref-list>
</back>
</article>
