BCIT Citations Collection | BCIT Institutional Repository


A novel Volt-VAR Optimization engine for smart distribution networks utilizing Vehicle to Grid dispatch
In recent years, Smart Grid technologies such as Advanced Metering, Pervasive Control, Automation and Distribution Management have created numerous control and optimization opportunities and challenges for smart distribution networks. The availability of Co-Gen loads and/or Electric Vehicles (EVs) enables the injection of reactive power into the grid by changing the inverter's operating mode, without considerable impact on active power operation. This feature has created a considerable opportunity for distribution network planners to explore whether EVs could be used in the distribution network as reliable VAR suppliers: it may be possible for network operators to employ some EVs as VAR suppliers for future distribution grids. This paper proposes an innovative Smart Grid-based Volt-VAR Optimization (VVO) engine, capable of minimizing system power loss cost as well as the operating cost of switched capacitor banks, while optimizing the system voltage using an improved Genetic Algorithm (GA) with two levels of mutation and two levels of crossover. The paper studies the impact of EVs with different charging and penetration levels on VVO in different operating scenarios. Furthermore, the paper demonstrates how a typical VVO engine could benefit from Vehicle-to-Grid (V2G) reactive power support. To assess V2G impacts on VVO and test the applicability of the proposed VVO, the revised IEEE 123-node test feeder in the presence of various load types is used as a case study., Article, Published. Received 24 May 2014, Revised 23 July 2015, Accepted 29 July 2015, Available online 8 August 2015.
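The improved GA itself is not reproduced in the abstract. As a rough illustration of the two-level idea only, a toy GA (cost function, parameters and problem size below are hypothetical, not the paper's) might switch between coarse and fine mutation rates and between single-point and uniform crossover as the search progresses:

```python
import random

random.seed(42)

N_BANKS = 8          # binary decision per capacitor bank: on/off (toy size)
TARGET_VAR = 5       # hypothetical number of switched-on banks minimizing loss

def cost(genome):
    # Toy stand-in for feeder power-loss cost plus a switching-cost term.
    injected = sum(genome)
    return (injected - TARGET_VAR) ** 2 + 0.1 * injected

def crossover(a, b, gen, max_gen):
    # Two levels of crossover: single-point early (explore), uniform late (exploit).
    if gen < max_gen // 2:
        p = random.randrange(1, N_BANKS)
        return a[:p] + b[p:]
    return [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]

def mutate(genome, gen, max_gen):
    # Two levels of mutation: high rate early, low rate late.
    rate = 0.2 if gen < max_gen // 2 else 0.02
    return [1 - g if random.random() < rate else g for g in genome]

def run_ga(pop_size=30, max_gen=60):
    pop = [[random.randint(0, 1) for _ in range(N_BANKS)] for _ in range(pop_size)]
    for gen in range(max_gen):
        pop.sort(key=cost)
        elite = pop[: pop_size // 2]          # keep the cheaper half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            children.append(mutate(crossover(a, b, gen, max_gen), gen, max_gen))
        pop = elite + children
    return min(pop, key=cost)

best = run_ga()
print(best, cost(best))
```

A real VVO cost function would come from a power-flow solve over the feeder; the schedule of mutation/crossover levels here is one plausible reading of "two levels", not the paper's exact operator design.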
NRC-IRC develops evaluation protocol for innovative vapour barrier
Vapour barriers were originally intended to keep building assemblies from getting wet, but they can sometimes end up preventing assemblies from drying out. An innovative product to manage moisture accumulation in the building envelope, however, may be able to address both issues: while the product acts as a vapour barrier under most conditions, it also allows excess moisture to escape. The Canadian Construction Materials Centre (CCMC) set out to determine whether this product can serve as a vapour barrier and an air barrier system and whether it conforms to the intent of applicable building code requirements. In collaboration with NRC-IRC researchers, CCMC developed a testing protocol for its evaluation, based on laboratory testing requirements for vapour diffusion, air leakage control and durability., Article, Published.
On keeping secrets
Proceedings of the 2015 Workshops at the Twenty-Ninth AAAI Conference on Artificial Intelligence in Austin, USA, 2015. Communication involves transferring information from one agent to another. An intelligent agent, either human or machine, is often able to choose to hide information in order to protect its interests. The notion of information hiding is closely linked to secrecy and dishonesty, but it also plays an important role in domains such as software engineering. In this paper, we consider the ethics of information hiding, particularly with respect to intelligent agents. In other words, we are concerned with situations that involve a human and an intelligent agent with access to different information. Is the intelligent agent justified in preventing a human user from accessing the information that it possesses? This is trivially true in the case where access control systems exist. However, we are concerned with the situation where an intelligent agent is able to use a reasoning system to decide not to share information with all humans. On the other hand, we are also concerned with situations where humans hide information from machines. Are we ever under a moral obligation to share information with a computational agent? We argue that questions of this form are increasingly important now, as people are increasingly willing to divulge private information to machines with a great capacity to reason with that information and share it with others., Conference paper, Published.
On the representation and verification of cryptographic protocols in a theory of action
Proceedings of 2010 Eighth Annual International Conference on Privacy Security and Trust (PST) in Ottawa, ON, Canada, 17-19 Aug. 2010. Cryptographic protocols are usually specified in an informal, ad hoc language, with crucial elements, such as the protocol goal, left implicit. We suggest that this is one reason that such protocols are difficult to analyse, and are subject to subtle and nonintuitive attacks. We present an approach for formalising and analysing cryptographic protocols in a theory of action, specifically the situation calculus. Our thesis is that all aspects of a protocol must be explicitly specified. We provide a declarative specification of underlying assumptions and capabilities in the situation calculus. A protocol is translated into a sequence of actions to be executed by the principals, and a successful attack is an executable plan by an intruder that compromises the specified goal. Our prototype verification software takes a protocol specification, translates it into a high-level situation calculus (Golog) program, and outputs any attacks that can be found. We describe the structure and operation of our prototype software, and discuss performance issues., Conference paper, Published.
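The Golog-based tool is not shown here, but the core idea that a successful attack is an executable plan reaching a state that compromises the specified goal can be illustrated with a stripped-down state-space search. The facts and action names below are invented for illustration and are far simpler than a situation calculus protocol theory:

```python
from collections import deque

# Abstract protocol-trace model (all facts and action names hypothetical).
# A state is a frozenset of ground facts; an attack is an action sequence
# reaching a state where the intruder knows the secret.
ACTIONS = [
    ("intercept",  {"sent(enc(secret,k))"},          {"has(enc(secret,k))"}),
    ("compromise", set(),                            {"has(k)"}),
    ("decrypt",    {"has(enc(secret,k))", "has(k)"}, {"knows(secret)"}),
]

def find_attack(initial, goal_violation):
    """Breadth-first search for a shortest action sequence that makes
    goal_violation hold, mirroring 'attack = executable plan'."""
    queue = deque([(frozenset(initial), [])])
    seen = {frozenset(initial)}
    while queue:
        state, plan = queue.popleft()
        if goal_violation in state:
            return plan
        for name, pre, add in ACTIONS:
            if pre <= state:                  # preconditions satisfied
                nxt = frozenset(state | add)  # apply action effects
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, plan + [name]))
    return None  # no attack found: the goal holds in this model

attack = find_attack({"sent(enc(secret,k))"}, "knows(secret)")
print(attack)
```

In the paper's setting the declarative theory, not a hand-coded action table, determines which actions are executable, and the planner is a Golog interpreter; this sketch only conveys the attack-as-plan framing.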
Optimal scaling of weight and waist circumference to height for adiposity and cardiovascular disease risk in individuals with spinal cord injury
Study Design: Observational cross-sectional study. Objectives: Body mass index (BMI), measured as a ratio of weight (Wt) to the square of height (Wt/Ht^2), waist circumference (WC) and waist-to-height ratio (WHtR) are common surrogate measures of adiposity. It is not known whether alternate scaling powers for height might improve the relationships between these measures and indices of obesity or cardiovascular disease (CVD) risk in individuals with spinal cord injury (SCI). We aimed to estimate the values of 'x' that render Wt/Ht^x and WC/Ht^x maximally correlated with dual energy x-ray absorptiometry (DEXA) total and abdominal body fat and Framingham Cardiovascular Risk Scores. Setting: Canadian public research institution. Methods: We studied 27 subjects with traumatic SCI. Height, Wt and body fat measurements were determined from DEXA whole-body scans. WC measurements were also obtained, and individual Framingham Risk Scores were calculated. For values of 'x' ranging from 0.0 to 4.0, in increments of 0.1, correlations between Wt/Ht^x and WC/Ht^x with total and abdominal body fat (kg and percentages) and Framingham Risk Scores were computed. Results: We found that BMI was a poor predictor of CVD risk, regardless of the scaling factor. However, BMI was strongly correlated with measures of obesity, and modification of the scaling factor from the standard (Wt/Ht^2) is not recommended. WC was strongly correlated with both CVD risk and obesity, and standard measures (WC and WHtR) are of equal predictive power. Conclusion: On the basis of our findings from this sample, alterations in scaling powers may not be necessary in individuals with SCI; however, these findings should be validated in a larger cohort., Peer-reviewed article, Published. Received 25 February 2014; revised 1 May 2014; accepted 1 August 2014; published online 30 September 2014.
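The scan described, computing correlations of Wt/Ht^x with an adiposity measure for x from 0.0 to 4.0 in increments of 0.1, can be sketched as follows. The data below are synthetic stand-ins, not the study's measurements, constructed so that the fat measure tracks Wt/Ht^2:

```python
import math
import random

random.seed(1)

def pearson(xs, ys):
    # Pearson correlation coefficient, computed directly from its definition.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = math.sqrt(sum((a - mx) ** 2 for a in xs))
    sy = math.sqrt(sum((b - my) ** 2 for b in ys))
    return cov / (sx * sy)

# Synthetic cohort of 27 (matching the study's sample size only): height in m,
# weight in kg, and a fat measure built to correlate with Wt/Ht^2 plus noise.
heights = [round(random.uniform(1.55, 1.95), 2) for _ in range(27)]
weights = [round(random.uniform(55, 110), 1) for _ in range(27)]
fat = [w / h ** 2 + random.gauss(0, 1) for w, h in zip(weights, heights)]

best_x, best_r = None, 0.0
for i in range(41):                       # x = 0.0, 0.1, ..., 4.0
    x = i / 10
    index = [w / h ** x for w, h in zip(weights, heights)]
    r = pearson(index, fat)
    if abs(r) > abs(best_r):
        best_x, best_r = x, r

print(f"best scaling power x = {best_x}, r = {best_r:.3f}")
```

With data generated this way the maximizing x lands near 2, which is what the scan would be expected to recover; the study applies the same grid search against DEXA fat and Framingham scores rather than a constructed variable.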
Ordinal conditional functions for nearly counterfactual revision
Proceedings of the 16th International Workshop on Non-Monotonic Reasoning (NMR 2016), Cape Town, South Africa; April 22-24, 2016. We are interested in belief revision involving conditional statements where the antecedent is almost certainly false. To represent such problems, we use Ordinal Conditional Functions that may take infinite values. We model belief change in this context through simple arithmetical operations that allow us to capture the intuition that certain antecedents cannot be validated by any number of observations. We frame our approach as a form of finite belief improvement, and we propose a model of conditional belief revision in which only the "right" hypothetical levels of implausibility are revised., Conference paper, Published.
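The arithmetical flavour of the approach can be sketched with a toy Ordinal Conditional Function in which one world carries an infinite rank, so that finite improvement operations never validate it. The worlds and the revision operator below are illustrative only, not the paper's exact definitions:

```python
import math

# An Ordinal Conditional Function assigns each possible world a rank
# (degree of implausibility); rank 0 = currently believed. Worlds ranked
# math.inf are "almost certainly false": no finite number of observations
# can bring them down to rank 0. (Toy worlds and values, not the paper's.)
kappa = {"w_normal": 0, "w_odd": 2, "w_counterfactual": math.inf}

def improve(kappa, worlds, strength=1):
    """Shift worlds satisfying the observed proposition down by `strength`,
    then renormalize so the minimum rank is 0. Subtracting from an infinite
    rank leaves it infinite, so such worlds are never validated."""
    shifted = {w: (r - strength if w in worlds else r) for w, r in kappa.items()}
    base = min(shifted.values())
    return {w: r - base for w, r in shifted.items()}

# Two observations supporting w_odd make it believed (rank 0)...
k = improve(improve(kappa, {"w_odd"}), {"w_odd"})
print(k)

# ...but no number of observations validates the infinitely ranked world.
for _ in range(1000):
    kappa = improve(kappa, {"w_counterfactual"})
print(kappa["w_counterfactual"])
```

This is the "finite belief improvement" intuition in miniature: finite arithmetic moves finite ranks, while infinite ranks are fixed points of every finite improvement.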
The path of the smart grid
Exciting yet challenging times lie ahead. The electrical power industry is undergoing rapid change. The rising cost of energy, the mass electrification of everyday life, and climate change are the major drivers that will determine the speed at which such transformations will occur. Regardless of how quickly various utilities embrace smart grid concepts, technologies, and systems, they all agree on the inevitability of this massive transformation. It is a move that will affect not only their business processes but also their organization and technologies., Article, Published.
A performance comparison between FRC and WWM reinforced slabs on grade
Proceedings of the 4th International Conference on Structural Health Monitoring of Intelligent Infrastructure (SHMII-4) 2009, 22-24 July 2009, Zurich, Switzerland. A comparative experimental study was conducted to investigate the effectiveness of fiber reinforcement as a non-corrosive alternative to welded-wire reinforcement in slabs on grade. Six full-scale slabs-on-grade, reinforced with various combinations of WWM (Welded Wire Mesh), polymeric macro-synthetic fibers (PMF) and cellulose fibers, were tested under a centrally concentrated load. Their ductility and load-carrying capacity were evaluated and compared. Based on the results of this study, it appears that high dosages of polymeric macrofibers can successfully reinforce concrete slabs. Given that the use of PMF eliminates the possibility of corrosion of reinforcement, this may be a superior option. Low dosages of fibers, however, appear to be an ineffective replacement for WWM: low dosages of PMF and cellulose fiber, added on their own or in combination with each other, failed to provide sufficient ductility or load-carrying capacity compared to the control slab when subjected to the load test. Slabs reinforced with cellulose fiber had a poor mechanical response in comparison to WWM, and therefore cellulose fiber on its own is not recommended., Conference paper, Published.
Performance-risk analysis for the design of high-performance affordable homes
Proceedings of the 3rd Building Enclosure Science & Technology (BEST3) Conference, Atlanta, USA, April 2-4, 2014. Net-zero energy, emissions, and carbon sustainability targets for buildings are becoming achievable with the use of renewable energy technologies and high-performance construction, equipment, and appliances. Methodologies and tools have also been developed and tested to help design teams search for viable strategies for net-zero buildings during the early stages of design. However, the risks of underperformance of high-performance technologies, systems, and whole buildings are usually not assessed methodically, and the negative consequences have been, often reluctantly, reported. This paper presents a methodology for explicitly considering and assessing underperformance risks during the design of high-performance buildings. The methodology is a first attempt to formalize extensive applied research and industry experience in the quest for net-zero energy homes in the U.S., and it builds on existing tools and methods from performance-based design, as well as from optimization, decision, and risk analysis. The methodology is knowledge-driven and iterative, so that newly acquired knowledge can be incorporated into decision making. As a point of departure, a clear definition of the project vision and a two-level organization of the corresponding building function performance objectives are laid out, with objectives further elaborated into high-performance targets and viable alternatives selected from the knowledge base to meet them. Then, a knowledge-guided search for optimized design strategies to meet the performance targets takes place, followed by a selection of optimized strategies to meet the objectives and the identification of associated risks from the knowledge base. These risks are then evaluated, leading either to mitigation strategies or to changed targets and alternatives, and feeding back to the knowledge base.
A case study of affordable homes in a hot, humid climate is used to test the methodology and demonstrate its application. The case study clearly illustrates the advantages of using the methodology to minimize underperformance risks. Further work will follow to develop the underpinning mathematical formalisms of the knowledge base and the risk evaluation procedure., Conference paper, Published.
A pilot scale comparison of the effects of chemical pre-treatments of wood chips on the properties of low consistency refined TMP
Proceedings of the International Mechanical Pulping Conference 2016, IMPC 2016. After decades of research and development, the technology of thermomechanical pulping (TMP) has dramatically improved, resulting in higher pulp quality, especially strength. However, the TMP industry is still faced with the challenge of continually increasing energy costs. One approach to reducing energy costs is to replace the second-stage high consistency (HC) refiner with several low consistency (LC) refiners, based on the observation that low consistency refining is more energy efficient than high consistency refining. The limitation of LC refining is loss of paper strength due to the high frequency of fibre cutting, especially at high refining intensity. Chemical treatment combined with low consistency refining provides an opportunity for even further energy savings. The chemical treatment could improve pulp properties, allowing for further energy reduction in the HC refining stage or reduced intensity during LC refining, resulting in less fibre cutting. It is also possible that the chemical treatment itself will improve the resistance of the fibre to cutting during LC refining., Conference paper, Published.
Predicting academic performance
Proceedings of the 23rd Annual ACM Conference on Innovation and Technology in Computer Science Education. The ability to predict student performance in a course or program creates opportunities to improve educational outcomes. With effective performance prediction approaches, instructors can allocate resources and instruction more accurately. Research in this area seeks to identify features that can be used to make predictions, to identify algorithms that can improve predictions, and to quantify aspects of student performance. Moreover, research in predicting student performance seeks to determine interrelated features and to identify the underlying reasons why certain features work better than others. This working group report presents a systematic literature review of work in the area of predicting student performance. Our analysis shows a clearly increasing amount of research in this area, as well as an increasing variety of techniques used. At the same time, the review uncovered a number of issues with research quality that drive a need for the community to provide more detailed reporting of methods and results and to increase efforts to validate and replicate work., Peer reviewed, Conference paper, Published.
Predictive algorithm for Volt/VAR optimization of distribution networks using Neural Networks
Proceedings of the IEEE Canadian Conference on Electrical and Computer Engineering (CCECE 2014), May 2014, Toronto, Canada. Smart Grid functions such as Advanced Metering Infrastructure, Pervasive Control and Distribution Management Systems have brought numerous control and optimization opportunities for distribution networks through more accurate and reliable techniques. This paper presents a new predictive approach for Volt/VAr Optimization (VVO) of smart distribution systems using Neural Networks (NN) and a Genetic Algorithm (GA). The proposed predictive algorithm is capable of predicting the load profile of target nodes a day ahead by employing the historical metering data of Smart Meters. It can further perform a comprehensive VVO in order to minimize distribution network loss/operating costs and run Conservation Voltage Reduction (CVR) to conserve more energy. To test the merits of the proposed algorithm, the British Columbia Institute of Technology north campus distribution grid is used as the research case study., Conference paper, Published.
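The abstract does not specify the network architecture or training regime. As a minimal stand-in for the day-ahead prediction step only, a single-neuron model (the simplest possible "network") can be fitted to synthetic hourly smart-meter history; the data, learning rate, and epoch count below are all hypothetical:

```python
import math
import random

random.seed(0)

# Synthetic hourly load profile (kW) for one node: a daily sinusoid plus
# noise. A stand-in for historical smart-meter data, not BCIT's records.
def day_profile():
    return [50 + 20 * math.sin(2 * math.pi * h / 24) + random.gauss(0, 1)
            for h in range(24)]

history = [day_profile() for _ in range(30)]   # 30 days of hourly readings

# Single-neuron "network": predict the hour-h load from the same hour
# yesterday, trained by stochastic gradient descent on squared error.
w, b = 0.0, 0.0
lr = 0.0001
for epoch in range(200):
    for d in range(1, len(history)):
        for h in range(24):
            x, y = history[d - 1][h], history[d][h]
            err = (w * x + b) - y
            w -= lr * err * x
            b -= lr * err

forecast = [w * x + b for x in history[-1]]    # day-ahead load forecast
print(f"w={w:.2f} b={b:.2f} peak forecast={max(forecast):.1f} kW")
```

A practical implementation would use a multi-layer network with calendar and weather features, and the forecast would then feed the GA-based VVO stage; this sketch only shows the shape of the history-to-forecast mapping.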
Preliminary results from field experimental study of rain load and penetration into wood-frame wall systems at window sill defects
14th Canadian Conference on Building Science and Technology, Toronto, Canada, October 29th-30th, 2014. A field study is presented here investigating the correlation between wind-driven rain (WDR) as the driving force and the relative proportions of water penetration at intended defects (openings) located at the interface of windows and exterior walls. In this field study, eight full-scale exterior-wall panels with vinyl siding and stucco claddings were built and installed on a field testing station exposed to the rain of British Columbia's west coast climate. This paper focuses on the preliminary results from one of the stucco wall panels with a discontinuity in the sealant around the perimeter of the window. The water passing through this defect was collected and measured. The instantaneous and automatic water collection measurements were synchronized with data gathered by a nearby weather station on wind-driven rain intensity, wind speed and direction. In addition, rain gauges on the exterior of the walls collected the wind-driven rain against each façade of the test station. Compared to previous computer simulations and laboratory experimental studies of rain penetration through exterior walls, this study was conducted under more realistic conditions: the panels were subjected to real wind-driven rain events, and the experiment collectively took into account rain that splashed off the wall façade upon impact and the rain water around the defect location due to run-off. The study is ongoing; when complete, its results will be useful for fine-tuning the principal moisture load applied in hygrothermal performance assessment and the design of exterior wall systems., Conference paper, Published.
Program Comprehension: Identifying Learning Trajectories for Novice Programmers
This working group asserts that Program Comprehension (PC) plays a critical part in the writing process. For example, this abstract is written from a basic draft that we have edited and revised until it clearly presents our idea. Similarly, a program is written in an incremental manner, with each step being tested, debugged and extended until the program achieves its goal. Novice programmers should develop their program comprehension as they learn to code, so that they are able to read and reason about code while they are writing it. To foster such competencies our group has identified two main goals: (1) to collect and define learning activities that explicitly cover key components of program comprehension and (2) to define possible learning trajectories that will guide teachers using those learning activities in their CS0/CS1 or K-12 courses. We plan to achieve these goals as follows: Step 1 Review the current state of research and development by analyzing literature on classroom activities that improve program comprehension. Step 2 Concurrently, survey lecturers at various institutions on their use of workshop activities to foster PC. Step 3 Use the outputs from both activities to define and conceptualize what is meant by PC in the context of novice programmers. Step 4 Catalog learning activities with regard to their prerequisites, intended learning outcomes and additional special characteristics. Step 5 Develop a map of learning activities and thereby also models of probable learning trajectories., Not peer reviewed, Conference proceedings
Purification and characterization of a selective growth regulator for human myelopoietic progenitor cells
A monoclonal antibody, named CAMAL-1, was raised previously in our laboratory to a common antigen of acute myeloid leukemia (CAMAL), and was shown to be highly specific in its recognition of cells from patients with acute (AML) or chronic (CML) myelogenous leukemia. CAMAL was also reported to be prognostic of disease, in that patients whose numbers of CAMAL-1 reactive cells were high, or rose over time, had poorer prognoses and shorter survival times than patients whose CAMAL values were low or decreased. This correlation between CAMAL and disease prognosis led to the discovery that CAMAL-1 immunoaffinity-purified leukemic cellular lysates contained a selective growth inhibitory activity for normal myeloid progenitor cells, since the growth of CML progenitors was not inhibited. The work described in this thesis focused primarily on the purification and characterization of the myelopoietic activity present in the CAMAL preparations, and its relationship to the leukemic marker (CAMAL). Initial purifications involved CAMAL-1 immunoaffinity chromatography of leukemic cellular lysates, followed by FPLC molecular size fractionation and/or preparative SDS-PAGE. The myelopoietic activity was located within a 30-35 kDa molecular weight fraction (P30), and the P30 fraction was consistently found to be selective in its inhibition of normal myeloid progenitors, since the growth of CML progenitors was not inhibited but was, in fact, stimulated. Antibodies were raised to P30 and used in the subsequent purification and characterization of the myelopoietic activity. Amino acid sequence analysis of the N-terminus and P30 tryptic peptides strongly suggested that P30 belonged to the serine protease family of enzymes, and the results obtained from protease assays indicated that P30 preparations did possess enzyme activity.
Prior to the completion of P30 molecular cloning experiments, however, the cDNA sequence for azurocidin/CAP37 was reported, and its predicted amino acid sequence was found to be identical to those obtained from the P30 protein samples. Azurocidin is a proteolytically inactive serine protease homologue, normally present in neutrophilic granules. Purified azurocidin did not possess inhibitory activity in normal progenitor cell assays; therefore, in order to isolate the biologic activity from azurocidin and other potentially contaminating proteins, P30 preparations were fractionated by reverse phase HPLC. The rpHPLC profiles were found to be similar to those reported for neutrophilic granules; however, the myelopoietic activity was obtained in a single rpHPLC fraction that aligned with the front portion of the azurocidin protein peak. Two dimensional isoelectric focusing/SDS-PAGE analysis of the biologically active rpHPLC fraction confirmed that it contained azurocidin, and no additional protein species were detected. Only the earlier eluting azurocidin rpHPLC fraction mediated the myelopoietic activity, and this fraction was also enriched in the higher molecular weight isoforms of azurocidin. Therefore, it appeared that a variably glycosylated isoform of azurocidin was mediating the biologic effects on myeloid progenitor cells, and because azurocidin obtained from normal neutrophils did not possess the myelopoietic activity, we speculate that the bioactive isoform of azurocidin is present in relatively higher amounts and/or is uniquely synthesized by leukemic cells., Thesis, Published.
