
BCIT Citations Collection


A performance comparison between FRC and WWM reinforced slabs on grade
Proceedings of the 4th International Conference on Structural Health Monitoring of Intelligent Infrastructure (SHMII-4) 2009, 22-24 July 2009, Zurich, Switzerland. A comparative experimental study was conducted to investigate the effectiveness of fiber reinforcement as a non-corrosive alternative to welded-wire reinforcement in slabs on grade. Six full-scale slabs on grade, reinforced with various combinations of WWM (Welded Wire Mesh), polymeric macro-synthetic fibers (PMF) and cellulose fibers, were tested under a centrally concentrated load, and their ductility and load-carrying capacity were evaluated and compared. Based on the results of this study, high dosages of polymeric macrofibers appear capable of successfully reinforcing concrete slabs. Given that the use of PMF eliminates the possibility of reinforcement corrosion, this may be a superior option. Low dosages of fibers, by contrast, appear to be an ineffective replacement for WWM: low dosages of PMF and cellulose fiber, whether added on their own or in combination with each other, failed to provide adequate ductility or load-carrying capacity relative to the control slab under the load test. Slabs reinforced with cellulose fiber had a poor mechanical response in comparison to WWM, so cellulose fiber on its own is not recommended., Conference paper, Published.
Performance-risk analysis for the design of high-performance affordable homes
Proceedings of the 3rd Building Enclosure Science & Technology (BEST3) Conference, Atlanta, USA, April 2-4, 2014. Net-zero energy, emissions, and carbon sustainability targets for buildings are becoming achievable with the use of renewable energy technologies and high-performance construction, equipment, and appliances. Methodologies and tools have also been developed and tested to help design teams search for viable strategies for net-zero buildings during the early stages of design. However, the risks of underperformance of high-performance technologies, systems, and whole buildings are usually not assessed methodically, and the negative consequences have been, often reluctantly, reported. This paper presents a methodology for explicitly considering and assessing underperformance risks during the design of high-performance buildings. The methodology is a first attempt to formalize extensive applied research and industry experience in the quest for net-zero energy homes in the U.S., and it builds on existing tools and methods from performance-based design, as well as optimization, decision, and risk analysis. The methodology is knowledge-driven and iterative, so that newly acquired knowledge can be incorporated into the decision making. As a point of departure, a clear definition of the project vision and a two-level organization of the corresponding building-function performance objectives are laid out, with objectives further elaborated into high-performance targets and viable alternatives selected from the knowledge base to meet them. Then a knowledge-guided search for optimized design strategies to meet the performance targets takes place, followed by a selection of optimized strategies to meet the objectives and the identification of associated risks from the knowledge base. These risks are then evaluated, leading either to mitigation strategies or to changed targets and alternatives, and feeding back to the knowledge base. A case study of affordable homes in a hot, humid climate is used to test the methodology and demonstrate its application; it clearly illustrates the advantages of using the methodology to minimize underperformance risks. Further work will follow to develop the underpinning mathematical formalisms of the knowledge base and the risk evaluation procedure., Conference paper, Published.
A pilot scale comparison of the effects of chemical pre-treatments of wood chips on the properties of low consistency refined TMP
Proceedings of the International Mechanical Pulping Conference 2016, IMPC 2016. After decades of research and development, the technology of thermomechanical pulping (TMP) has dramatically improved, resulting in higher pulp quality, especially strength. However, the TMP industry still faces the challenge of continually increasing energy costs. One approach to reducing energy costs is to replace the second-stage high consistency (HC) refiner with several low consistency (LC) refiners, based on the observation that low consistency refining is more energy efficient than high consistency refining. The limitation of LC refining is a loss of paper strength due to the high frequency of fibre cutting, especially at high refining intensity. Chemical treatment combined with low consistency refining provides an opportunity for even further energy savings: the treatment could improve pulp properties, allowing further energy reduction in the HC refining stage, or permit reduced intensity during LC refining, resulting in less fibre cutting. Indeed, it is also possible that the chemical treatment itself will improve the resistance of the fibre to cutting during LC refining., Conference paper, Published.
Predicting academic performance
Proceedings of the 23rd Annual ACM Conference on Innovation and Technology in Computer Science Education. The ability to predict student performance in a course or program creates opportunities to improve educational outcomes. With effective performance-prediction approaches, instructors can allocate resources and instruction more accurately. Research in this area seeks to identify features that can be used to make predictions, to identify algorithms that can improve predictions, and to quantify aspects of student performance. Moreover, research in predicting student performance seeks to determine interrelated features and to identify the underlying reasons why certain features work better than others. This working group report presents a systematic literature review of work in the area of predicting student performance. Our analysis shows a clearly increasing amount of research in this area, as well as an increasing variety of techniques used. At the same time, the review uncovered a number of issues with research quality that drive a need for the community to provide more detailed reporting of methods and results and to increase efforts to validate and replicate work., Peer reviewed, Conference paper, Published.
Predictive algorithm for Volt/VAR optimization of distribution networks using Neural Networks
Proceedings of the IEEE Canadian Conference on Electrical and Computer Engineering (CCECE 2014), May 2014, Toronto, Canada. Smart Grid functions such as Advanced Metering Infrastructure, Pervasive Control and Distribution Management Systems have brought numerous control and optimization opportunities for distribution networks through more accurate and reliable techniques. This paper presents a new predictive approach for Volt/VAr Optimization (VVO) of smart distribution systems using Neural Networks (NN) and a Genetic Algorithm (GA). The proposed predictive algorithm is capable of predicting the load profile of target nodes a day ahead by employing the historical metrology data of Smart Meters. It can further perform a comprehensive VVO in order to minimize distribution network loss/operating costs and run Conservation Voltage Reduction (CVR) to conserve more energy. To test the merits of the proposed algorithm, the British Columbia Institute of Technology north campus distribution grid is used as a case study., Conference paper, Published.
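A rough illustration of the day-ahead prediction step described above may help: the sketch below trains a small neural network to map one day's 24-hour smart-meter profile to the next day's. The synthetic data, feature layout and network size are assumptions for illustration, not the authors' actual model.
```python
# Minimal sketch of day-ahead load forecasting from smart-meter history.
# All data here are synthetic stand-ins for AMI measurements.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# 60 days of hourly load (kW): a daily shape plus noise.
days = 60
hours = np.arange(24)
profile = 50 + 30 * np.sin((hours - 6) / 24 * 2 * np.pi)
history = profile + rng.normal(0, 3, size=(days, 24))

# Features: yesterday's 24-hour profile; target: today's profile.
X, y = history[:-1], history[1:]
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
model.fit(X[:-10], y[:-10])

# Day-ahead prediction for the 10 held-out days, scored by mean abs error.
pred = model.predict(X[-10:])
print("MAE (kW):", np.abs(pred - y[-10:]).mean().round(2))
```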
Preliminary results from field experimental study of rain load and penetration into wood-frame wall systems at window sill defects
14th Canadian Conference on Building Science and Technology, Toronto, Canada, October 29th-30th, 2014. A field study is presented investigating the correlation between wind-driven rain (WDR) as the driving force and the relative proportions of water penetration at intended defects (openings) located at the interface of windows and exterior walls. Eight full-scale exterior-wall panels with vinyl siding and stucco claddings were built and installed on a field testing station exposed to the rain of British Columbia’s west coast climate. This paper focuses on the preliminary results from one of the stucco wall panels with a discontinuity in the sealant around the perimeters of the windows. The water passing through this defect was collected and measured, and the instantaneous, automatic water-collection measurements were synchronized with data gathered by a nearby weather station on wind-driven rain intensity, wind speed and direction. In addition, rain gauges on the exterior of the walls measured the wind-driven rain against each façade of the test station. Compared to previous computer simulations and laboratory experimental studies of rain penetration through exterior walls, this study was conducted under more realistic conditions: the panels were subjected to real wind-driven rain events, and the experiment accounted for rain that splashed off the wall façade upon impact as well as for run-off water around the defect location. The study is ongoing; when complete, its results will be useful for fine-tuning the principal moisture load applied in the hygrothermal performance assessment and design of exterior wall systems., Conference paper, Published.
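For readers unfamiliar with how a wind-driven rain load of the kind measured here is commonly estimated, the sketch below applies a widely used semi-empirical driving-rain relation; the driving-rain factor and weather values are assumed for illustration, and the study itself relies on measured, not modelled, WDR.
```python
# Hedged illustration of a common semi-empirical driving-rain relation:
#   r_wdr = DRF * V * cos(theta) * r_h
# DRF: driving-rain factor (~0.2 s/m, drop-size dependent), V: wind speed
# at the wall, theta: wind angle from the wall normal, r_h: horizontal rain.
import math

def wdr_intensity(v_wind, theta_deg, r_h, drf=0.2):
    """Wind-driven rain deposition on the facade (same units as r_h)."""
    return drf * v_wind * math.cos(math.radians(theta_deg)) * r_h

# Example: 8 m/s wind, 30 degrees off the wall normal, 4 mm/h of rain.
print(round(wdr_intensity(8.0, 30.0, 4.0), 2), "mm/h on the facade")
```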
Program Comprehension: Identifying Learning Trajectories for Novice Programmers
This working group asserts that Program Comprehension (PC) plays a critical part in the writing process. For example, this abstract is written from a basic draft that we have edited and revised until it clearly presents our idea. Similarly, a program is written in an incremental manner, with each step being tested, debugged and extended until the program achieves its goal. Novice programmers should develop their program comprehension as they learn to code, so that they are able to read and reason about code while they are writing it. To foster such competencies, our group has identified two main goals: (1) to collect and define learning activities that explicitly cover key components of program comprehension and (2) to define possible learning trajectories that will guide teachers using those learning activities in their CS0/CS1 or K-12 courses. We plan to achieve these goals as follows: Step 1: Review the current state of research and development by analyzing literature on classroom activities that improve program comprehension. Step 2: Concurrently, survey lecturers at various institutions on their use of workshop activities to foster PC. Step 3: Use the outputs from both activities to define and conceptualize what is meant by PC in the context of novice programmers. Step 4: Catalog learning activities with regard to their prerequisites, intended learning outcomes and additional special characteristics. Step 5: Develop a map of learning activities and thereby also models of probable learning trajectories., Not peer reviewed, Conference proceedings
Quantifying the effects of on-the-fly changes of seating configuration on the stability of a manual wheelchair
Proceedings of the 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Seogwipo, South Korea, 11-15 July 2017. Manual wheelchairs are generally designed with a fixed frame, which is not optimal for every situation. Adjustable on-the-fly seating allows users to rapidly adapt their wheelchair configuration to suit different tasks; these changes move the center of gravity (CoG) of the system, altering the wheelchair's stability and maneuverability. To assess these changes, a computer simulation of a manual wheelchair was created with adjustable seat, backrest, rear-axle position and user position, and validated with experimental testing. The stability of the wheelchair was most affected by the position of the rear axle, but adjustments to the backrest and seat angles also produced stability improvements that could be used when wheeling in the community. These findings identify the most influential parameters for wheelchair stability and maneuverability, and provide quantitative guidelines for the use of manual wheelchairs with on-the-fly adjustable seats., Conference paper, Published.
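As a back-of-envelope illustration of why seating changes alter stability, the sketch below computes a combined centre of gravity from hypothetical component masses and positions and derives the rear tipping angle; all values are assumptions, not the paper's measured configuration.
```python
# Rear tipping angle implied by the combined user + wheelchair CoG
# relative to the rear axle. Masses and positions are hypothetical.
import math

# (mass kg, x m forward of rear axle, z m above ground) per component
components = [
    (75.0, 0.25, 0.65),   # user on the seat
    (12.0, 0.20, 0.40),   # frame + seat
    ( 4.0, 0.00, 0.30),   # rear wheels (on the axle)
]

m_tot = sum(m for m, _, _ in components)
x_cog = sum(m * x for m, x, _ in components) / m_tot
z_cog = sum(m * z for m, _, z in components) / m_tot

# The chair tips backward once it is pitched far enough that the CoG
# passes over the rear-wheel contact point.
tip_angle = math.degrees(math.atan2(x_cog, z_cog))
print(f"CoG: {x_cog:.3f} m ahead of axle, {z_cog:.3f} m high")
print(f"Rear tipping angle ~ {tip_angle:.1f} deg")
# Moving the rear axle rearward shifts the CoG forward relative to it,
# raising the tipping angle (more stable, less maneuverable).
```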
Ranking functions for belief change
Proceedings of the 6th International Conference on Agents and Artificial Intelligence in Angers, France, 2014. In this paper, we explore the use of ranking functions in reasoning about belief change. It is well-known that the semantics of belief revision can be defined either through total pre-orders or through ranking functions over states. While both approaches have similar expressive power with respect to single-shot belief revision, we argue that ranking functions provide distinct advantages at both the theoretical level and the practical level, particularly when actions are introduced. We demonstrate that belief revision induces a natural algebra over ranking functions, which treats belief states and observations in the same manner. When we introduce belief progression due to actions, we show that many natural domains can be easily represented with suitable ranking functions. Our formal framework uses ranking functions to represent belief revision and belief progression in a uniform manner; we demonstrate the power of our approach through formal results, as well as a series of natural problems in commonsense reasoning., Conference paper, Published.
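For concreteness, a minimal sketch of Spohn-style ranking-function revision, the kind of operation this framework builds on, is given below; the two-variable domain and revision strength are illustrative assumptions.
```python
# Ranking functions assign each world a grade of implausibility
# (0 = most plausible). Revision by (prop, strength) shifts the two
# partitions so prop becomes believed with the given firmness.
from itertools import product

worlds = list(product([True, False], repeat=2))   # (rain, wet)

# Initial ranking: worlds where rain and wet disagree are implausible.
kappa = {w: 0 if w[0] == w[1] else 2 for w in worlds}

def rank(kappa, prop):
    """Rank of a proposition = rank of its most plausible world."""
    return min(kappa[w] for w in worlds if prop(w))

def revise(kappa, prop, strength):
    """(prop, strength)-conditionalization over the ranking."""
    k_a = rank(kappa, prop)
    k_na = rank(kappa, lambda w: not prop(w))
    return {w: kappa[w] - k_a if prop(w) else kappa[w] - k_na + strength
            for w in worlds}

rain = lambda w: w[0]
kappa2 = revise(kappa, rain, strength=1)
print(kappa2)                                  # rain-worlds now at rank 0
print("rain believed:", rank(kappa2, lambda w: not rain(w)) > 0)
```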
Real-time adaptive optimization engine algorithm for integrated Volt/VAr optimization and conservation voltage reduction of smart microgrids
Proceedings from the CIGRÉ Canada Conference, Montreal, Sept. 2012. In the past decade, smart microgrids have raised the feasibility and affordability of adaptive, real-time Volt/VAr optimization (VVO) and Conservation Voltage Reduction (CVR) implementations through exclusive features such as smart metering technologies and various types of dispersed generation. Smart distribution networks are presently capable of achieving higher degrees of efficiency and reliability by employing a new integrated Volt/VAr optimization system. For the VVO application, two well-known approaches are recommended by different utilities and/or companies: centralized VVO and decentralized VVO. In centralized VVO, the processing system is placed in a central controller unit, such as a DMS in the so-called “Utility Back Office”. The DMS uses relevant measurements taken from termination points (i.e. utility subscribers), supplied to it either by field collectors or directly by the MDMS, to determine the best possible settings for field-bound VVO/CVR assets to achieve the desired optimization and conservation targets. These settings are then off-loaded to the assets through existing downstream pipes, such as a SCADA network. In contrast, decentralized VVO utilizes VVO/CVR engines located in the field, in close proximity to the relevant assets, to conserve voltage and energy according to local attributes of the distribution network. In this case, local measurements do not need to travel from the field to the back office, and the new settings for VVO/CVR assets are determined locally rather than by a centralized controller. Without expressing a preference between the above-mentioned VVO techniques, this paper studies an adaptive optimization engine for real-time VVO/CVR in smart microgrids based on Intelligent Agent technology. The optimization algorithm provides the best solution to the VVO/CVR problem at each real-time stage by minimizing system loss cost, and it improves the energy efficiency and voltage profile of the relevant distribution system. The algorithm may employ distributed generation sources to address the Volt/VAr optimization problem in real time. Coordinated VVO/CVR requires real-time data analysis every 15 minutes. It utilizes a distributed command-and-control architecture to supply the VVO Engine (VVOE) with the required data, and it secures real-time configuration from the VVO engine for the VVO control devices such as On-Load Tap Changers (OLTCs), Voltage Regulators (VRs) and Capacitor Banks (CBs). It also has the option of employing distributed generation (DG) as well as modelling load effects in the VVO/CVR application. The algorithm minimizes the distribution network power-loss cost at each time stage and checks the voltage deviation of distribution buses and distributed generation sources, considering different types of constraints such as system power flow, distribution network power factor, system active and reactive power constraints, and switching limitations of Volt/VAr control devices. The algorithm receives the required real-time data from an intelligent agent and then solves the real-time VVO/CVR problem to find the best configuration of the network in real time. The paper uses the British Columbia Institute of Technology (BCIT) distribution network as its case study to explore the effectiveness and accuracy of the optimization engine. Moreover, the VVO/CVR optimization algorithm is implemented in two configurations: (a) VVO/CVR confined to the substation, and (b) VVO/CVR within the substation and along distribution feeders. The algorithm also checks the availability of DGs to assist VVO/CVR control functions and assesses the impact of new distributed sources, such as a Flywheel Energy Storage System (FESS), on real-time VVO/CVR. For this purpose, the algorithm classifies DGs in a microgrid based on their impacts and instantiates them based on their feasibility for real-time VVO/CVR application., Conference paper, Published.
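To make the per-stage optimization concrete, the toy sketch below enumerates settings of a single tap changer and capacitor bank on a hypothetical two-bus feeder and picks the feasible setting with the lowest loss cost; the feeder data, limits and loss model are assumptions, far simpler than the multi-asset engine described in the paper.
```python
# Toy per-stage VVO/CVR decision: brute-force search over one OLTC tap
# and one capacitor bank, minimizing loss cost subject to voltage limits.
from itertools import product

R, X = 0.05, 0.10          # feeder impedance (pu)
P_load, Q_load = 0.8, 0.4  # load (pu)
price = 80.0               # loss-cost weight ($ per pu loss)

best = None
for tap, cap in product(range(-2, 3), [0.0, 0.1, 0.2]):
    v_src = 1.0 + 0.0125 * tap          # tap step = 1.25%
    q_net = Q_load - cap                # capacitor offsets reactive load
    # Approximate voltage drop and I^2 R loss on the feeder.
    v_load = v_src - (R * P_load + X * q_net) / v_src
    i2 = (P_load**2 + q_net**2) / v_load**2
    loss_cost = price * i2 * R
    if 0.95 <= v_load <= 1.05:          # CVR keeps voltage low but legal
        if best is None or loss_cost < best[0]:
            best = (loss_cost, tap, cap, v_load)

cost, tap, cap, v = best
print(f"tap={tap:+d}, cap={cap} pu -> V={v:.3f} pu, loss cost={cost:.3f}")
```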
Real-time co-simulated platform for novel Volt-VAR Optimization of smart distribution network using AMI data
Accepted at the IEEE International Conference on Smart Energy Grid Engineering, May 2015. This paper presents a real-time co-simulated platform for novel voltage and reactive power optimization (VVO) of distribution grids through a real-time digital simulator (RTDS) in the presence of a reliable communication platform. The proposed VVO engine is able to capture quasi-real-time data from local Advanced Metering Infrastructure (AMI) and to optimize the distribution network at each quasi-real-time stage (every 5 minutes) based on the system's main characteristics (i.e. active/reactive power of nodes). At each time stage, the VVO engine aims to minimize losses in the distribution network and to improve the voltage profile of the system. In order to test the robustness, performance and applicability of the proposed Volt-VAR Optimization engine, a 33-node distribution network was modeled and studied in a real-time co-simulated environment using an RTDS and a real communication platform with the DNP3 protocol. The preliminary results demonstrate the strong performance of the proposed AMI-based VVO engine and show that it enables the system to achieve a higher level of loss/operating-cost reduction compared with conventional approaches., Conference paper, Published.
Real-time communication platform for Smart Grid adaptive Volt-VAR Optimization of distribution networks
Proceedings of the IEEE International Conference on Smart Energy Grid Engineering (SEGE), Aug. 2015, Oshawa, ON, Canada. This paper investigates a real-time communication platform for a Smart Grid adaptive Volt-VAR Optimization (VVO) engine. Novel VVO techniques receive inputs from Advanced Metering Infrastructure (AMI) to dynamically optimize distribution networks. As communication platform design and characteristics affect Smart Grid-based VVO performance in terms of control accuracy and response time, studies of the ICT supporting VVO are essential for grid planners and/or power utilities. Hence, this paper first introduces a real-time co-simulated environment comprising a Smart Grid adaptive VVO engine, an RTDS model and a system communication platform using the DNP3 protocol. This platform is built to test and assess the influence of the components of the Smart Grid monitoring and control system, namely the sensors, measurement units and communication infrastructure, on the operation and control of VVO. Moreover, this paper uses the real-time platform to check the robustness of the monitoring and control applications under communication-network conditions such as delays and packet loss. Finally, the paper investigates how such a platform can examine communication issues while taking system requirements into consideration. A 33-node distribution feeder is employed to check system performance through communication parameters such as throughput and response time., Conference paper, Published.
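A rough sense of the response-time parameter studied here can be conveyed with a simple latency budget; the link rate, packet size, distance and processing time below are hypothetical, not measurements from the platform in the paper.
```python
# Back-of-envelope end-to-end delay for one AMI reading:
# transmission + propagation + processing. All parameters hypothetical.
packet_bits  = 2_000      # one framed meter reading
link_bps     = 100_000    # shared AMI backhaul throughput
distance_m   = 5_000      # meter to substation controller
prop_speed   = 2e8        # signal speed in the medium (m/s)
processing_s = 0.010      # parsing + VVO engine queueing

delay = packet_bits / link_bps + distance_m / prop_speed + processing_s
print(f"per-reading delay ~ {delay * 1000:.1f} ms")
```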
Recruiting new genes in evolving genetic networks
Proceedings of the World Congress on Engineering and Computer Science 2007, WCECS 2007, October 24-26, 2007, San Francisco, USA. Gene recruitment, or co-option, is defined as the placement of a gene under a foreign regulatory system. Such rearrangement of pre-existing regulatory networks can lead to an increase in genomic complexity, and this reorganization is recognized as a major driving force in evolution. We simulated the evolution of gene networks by means of the Genetic Algorithm (GA) technique, using standard GA methods of (point) mutation and multi-point crossover, as well as our own operators for introducing new genes into, or withdrawing them from, the network. The starting point for our computer evolutionary experiments was a minimal 4-gene dynamic model representing the real genetic network controlling segmentation in the fruit fly Drosophila. Model output was fitted to experimentally observed gene expression patterns in the early fly embryo. We found that the mutation operator, together with the gene-introduction procedure, was sufficient for recruiting new genes into pre-existing networks, and reinforcement of the evolutionary search by crossover operators facilitates this recruitment. Gene recruitment causes outgrowth of an evolving network, resulting in structural and functional redundancy; such redundancies can affect the robustness and evolvability of networks., Conference paper, Published.
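A minimal sketch of the evolutionary machinery described above, point mutation, multi-point crossover and a gene-introduction operator that widens the genome, is shown below on a stand-in fitness function; the real study evolved a dynamical 4-gene Drosophila segmentation model fitted to expression data.
```python
# GA with point mutation, 2-point crossover, and a "gene introduction"
# operator that splices new genes in. Fitness is a stand-in toy objective.
import random

random.seed(1)

def fitness(genome):
    return -abs(sum(genome))          # stand-in: drive weight sum to 0

def mutate(genome, rate=0.1):
    return [g + random.gauss(0, 0.2) if random.random() < rate else g
            for g in genome]

def crossover(a, b, points=2):
    n = min(len(a), len(b))
    cuts = sorted(random.sample(range(1, n), points))
    child, src, last = [], (a, b), 0
    for i, c in enumerate(cuts + [n]):
        child += src[i % 2][last:c]   # alternate parents between cuts
        last = c
    return child

def introduce_gene(genome):
    """Recruitment operator: insert a new regulatory weight."""
    g = genome[:]
    g.insert(random.randrange(len(g) + 1), random.gauss(0, 1))
    return g

pop = [[random.gauss(0, 1) for _ in range(4)] for _ in range(40)]  # 4-gene start
for gen in range(200):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    pop = parents + [mutate(crossover(*random.sample(parents, 2)))
                     for _ in range(28)]
    pop += [introduce_gene(random.choice(parents)) for _ in range(2)]

best = max(pop, key=fitness)
print(len(best), "genes, fitness", round(fitness(best), 4))
```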
Retroviral genetic algorithms
Proceedings of the 2011 International Conference on Evolutionary Computation Theory and Applications. Classical understandings of biological evolution inspired the creation of the entire family of Evolutionary Computation (EC) heuristic optimization techniques. In turn, the development of EC has shown how living organisms use biomolecular implementations of these techniques to solve particular problems in survival and adaptation. An example of such a natural Genetic Algorithm (GA) is the way in which a higher organism's adaptive immune system selects antibodies and competes against its complement, the development of antigen variability by pathogenic organisms. In our approach, we use operators that implement the reproduction and diversification of genetic material in a manner inspired by retroviral reproduction and a genetic-engineering technique known as DNA shuffling. We call this approach Retroviral Genetic Algorithms, or retroGA (Spirov and Holloway, 2010). Here, we extend retroGA to include: (1) the utilization of tags in strings; (2) the capability of the Reproduction-Crossover operator to read these tags and interpret them as instructions; and (3), as a consequence, the use of more than one reproductive strategy. We validated the efficacy of the extended retroGA technique with benchmark tests on concatenated trap functions and compared these with Royal Road and Royal Staircase functions., Conference paper, Published.
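The concatenated trap functions used as a benchmark here are easy to state in code; the sketch below uses an assumed block size of k = 4 (the paper's exact parameters are not given in this abstract) and shows why the function is deceptive for hill-climbers.
```python
# A k-bit trap rewards the all-ones block, but fitness otherwise rises
# as ones are removed, pulling local search away from the optimum.
def trap(block, k=4):
    u = sum(block)                 # number of ones in the block
    return k if u == k else k - 1 - u

def concatenated_traps(bits, k=4):
    """Sum of independent traps over consecutive k-bit blocks."""
    return sum(trap(bits[i:i + k], k) for i in range(0, len(bits), k))

# The deceptive gradient: 0 ones scores 3, 1 one scores 2, ... 4 ones score 4.
for u in range(5):
    block = [1] * u + [0] * (4 - u)
    print(u, "ones ->", trap(block))
print("optimum:", concatenated_traps([1] * 20))   # 5 blocks x 4 bits
```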
A scoping review of data logger technologies used with manual wheelchairs
Proceedings of the 2015 RESNA Annual Conference. In recent years, a growing number of studies have used data logger technologies to document the driving and physiological characteristics of manual wheelchair users. However, the technologies used differ markedly in characteristics such as measured outcomes, ease of use and user burden. The objective of this study was to examine the extent of research activity that relied on data logger technologies for manual wheelchair users. We undertook a scoping review of the scientific and gray literature: five databases (Medline, Compendex, CINAHL, EMBASE and Google Scholar) were searched from January 1979 to November 2014, and the review retained 104 papers. The selected papers document a wide variety of systems and technologies measuring a whole range of outcomes. Of all technologies combined, 16.8% were accelerometers installed on the user, 14.8% were magnetic odometers or odometers installed on the wheelchair, 10.2% were accelerometers installed on the wheelchair and 8.67% were heart monitors. It is therefore not surprising that the most reported outcomes were distance, speed and acceleration of the wheelchair, and heart rate. In the future, it may be necessary to reach a consensus on which outcomes are important to measure and how. Technological improvements and access to less expensive devices will likely make it possible to measure many important outcomes easily and at relatively low cost., Conference paper, Published.
