
Frontiers of Engineering Management

Front. Eng    2020, Vol. 7 Issue (2) : 204-222     https://doi.org/10.1007/s42524-019-0040-5
RESEARCH ARTICLE
Convergence to real-time decision making
James M. TIEN()
College of Engineering, University of Miami, Coral Gables, FL 33124, USA
Abstract

Real-time decision making reflects the convergence of several digital technologies, including those concerned with the promulgation of artificial intelligence and other advanced technologies that underpin real-time actions. More specifically, real-time decision making can be depicted in terms of three converging dimensions: the Internet of Things, decision making, and real-time. The Internet of Things includes tangible goods, intangible services, ServGoods, and connected ServGoods. Decision making includes model-based analytics (since before 1990), information-based Big Data (since 1990), and training-based artificial intelligence (since 2000), and it is bolstered by the evolving real-time technologies of sensing (i.e., capturing streaming data), processing (i.e., applying real-time analytics), reacting (i.e., making decisions in real-time), and learning (i.e., employing deep neural networks). Real-time includes mobile networks, autonomous vehicles, and artificial general intelligence. Central to decision making, especially real-time decision making, is the ServGood concept, which the author introduced in an earlier paper (2012). It is a physical product or good encased by a services layer that renders the good more adaptable and smarter for a specific purpose or use. The addition of another layer of communication sensors could further enhance its smartness and adaptiveness. Such connected ServGoods constitute a solid foundation for the advanced products of tomorrow, which can further display their growing intelligence through real-time decisions.

Keywords real-time decision making      services      goods      ServGoods      Big Data      Internet of Things      artificial intelligence      wireless communications     
Corresponding Author(s): James M. TIEN   
Just Accepted Date: 07 May 2019   Online First Date: 06 June 2019    Issue Date: 27 May 2020
 Cite this article:   
James M. TIEN. Convergence to real-time decision making[J]. Front. Eng, 2020, 7(2): 204-222.
 URL:  
http://journal.hep.com.cn/fem/EN/10.1007/s42524-019-0040-5
http://journal.hep.com.cn/fem/EN/Y2020/V7/I2/204
Fig.1  Three-dimensional approach to real-time DM.
Fig.2  3D internet of connected ServGoods.
Fig.3  4D Internet of connected ServGoods.
Focus Tangible goods Intangible services ServGoods Connected ServGoods
Production Pre-produced Co-produced Demand-produced Internet-connected
Variability Identical Heterogeneous Assorted Assorted
Physicality Tangible Intangible Mixed Mixed
Product Inventoryable Perishable Identifiable Identifiable
Objective Reliable Personalizable Adaptable Connectable
Satisfaction Utility-related Expectation-related Satisfaction-related Status-related
Life cycle Recyclable Reusable Flexible Agile
Example Car Electronic-assists IoT Connected autos
Tab.1  Tangible goods, intangible services, ServGoods and connected ServGoods
Focus Scope Tangible goods Intangible services ServGoods Connected ServGoods
1. Nvidia AI chips X X X X
2. SpaceX Recycled rockets X --- --- ---
3. Amazon AI/Alexa X X X X
4. 23andMe Genome --- X --- ---
5. Alphabet AI Apps X X X X
6. iFlytek China AI Siri X X X X
7. Kite Pharma Cancer therapy --- X --- ---
8. Tencent China AI WeChat X X X X
9. Regeneron Biotech treatments --- X --- ---
10. Spark Biotech treatments --- X --- ---
Tab.2  Top 10 smartest companies in 2017 (MIT, 2017)
Connection Example ServGoods
H2H Intercom; cellular devices; Bluetooth communication; web-based speaker system; electronic voice communication; short message service
H2M E-commerce; occupancy sensor; point-of-sale; social media; webcam; security camera; ATM (automatic teller machine); wearable; laptop
M2H Radio; video internet protocol phone; high power WiFi (wireless networking); e-billboard; RFID (radio frequency identification); alarm system
M2M Connected device; embedded artificially intelligent device; environmental monitoring gadget; video conferencing; AV; networked sensor; drone
Tab.3  Connected ServGoods
Fig.4  Decision making: A 3-tier perspective.
Fig.5  (a) DM: underpinning informatics; (b) DM: Big Data focus; (c) DM: Big Data and artificial intelligence foci.
Fig.6  World-wide growth in digital data. Source: International Data Corporation.
Component Element Model-based analytics Big Data
Acquisition Focus Problem-oriented Data-oriented
Emphasis Data quality Data quantity
Scope Representative sample Large sample
Access Focus On-supply, local-computing On-demand, cloud-computing
Emphasis Over-time accessibility Real-time accessibility
Scope Personal-security Cyber-security
Analytics Focus Analytical elegance Analytical messiness
Emphasis Causative relationship Correlative relationship
Scope Data-rich, information-poor Data-rich, information-unleashed
Application Focus Steady-state optimality Real-time feasibility
Emphasis Model-driven Evidence-driven
Scope Objective findings Good enough results
Tab.4  Data processing approaches: Big Data versus model-based analytics
Component Element Potential concern
Acquisition Focus Big Data does not imply big/complete understanding of underlying problem
Emphasis Big Data quantity does not imply data quality
Scope Big Data proxies or samples do not imply a representative or even a complete sample
Access Focus Big Data’s on-demand accessibility may create privacy concerns
Emphasis Big Data’s real-time abilities may obscure past and future access concerns
Scope Big Data’s cyber-security concerns may overlook personal-security concerns
Analytics Focus Big Data’s inherent messiness may obscure underlying relationships
Emphasis Big Data’s correlational finding may result in an unintended causal consequence
Scope Big Data’s unleashing of information may obscure underlying knowledge
Application Focus Big Data’s feasible explanations may obscure more probable explanations
Emphasis Big Data’s evidence-driven findings may obscure underlying factual knowledge
Scope Big Data’s subjective, consumer-centric findings may obscure simpler objective findings
Tab.5  Big Data: potential concerns (Tien, 2013)
AI category Definition; example ServGoods
Electronic assistants Software agents that can understand human language and then act appropriately; Microsoft’s digital assistant/Cortana, Apple’s HomeKit/Siri, Amazon’s Echo/Alexa, Google’s Home/Assistant
Web bots Software applications that run automated tasks (i.e., scripts) over the Internet; Web Crawler, search engine, Facebook Bot, Chatterbot, Twitter Bot, Malicious Bot (Botnet, Zombie Bot, Ticketing Bot, Spambot)
Electromechanical robots Programmable machines that are capable of carrying out a complex series of actions automatically; military quadrupedal machines, drones, mining robots, welding robots, collaborative robots, AVs
Medical devices Appliances intended for diagnostic and/or therapeutic purposes; da Vinci Surgical Assist System, Cloud-Based Health Monitoring System, Glucose Monitor, Insulin Pump, Rehabilitation Robot, Telepresence Robot, Wearables, Defibrillator, Pacemaker
Platforms Users employ pre-built machine learning and DM algorithms and applications; virtual reality, speech recognition, natural language generation, recommendation systems (e.g., Amazon, Netflix), image recognition, predictive analytics, speech synthesis
Tab.6  AI categories and ServGoods
Period Major milestones
Defining (1950–1975) 1950 (Alan Turing introduces “machine thinking”); 1955 (John McCarthy, Marvin Minsky, and Claude Shannon met at Dartmouth to coin “AI”); 1961 (Marvin Minsky publishes “Steps toward artificial intelligence”); 1972 (Hubert Dreyfus publishes “What computers can’t do”)
Winter (1975–2000) 1979 (Backgammon program by Hans Berliner defeated human champion); 1997 (IBM’s Deep Blue chess computer defeated world champion Garry Kasparov in a rematch)
Renaissance (2000–present) 2004 (DARPA sponsors driverless car competition across the Mojave Desert); 2011 (IBM’s Watson defeated Jeopardy! champions); 2016 (Google’s AlphaGo beat Korea’s Go champion)
Tab.7  AI time line
Approach (reference) Scope
Neural network (McCulloch & Pitts, 1943) In a biological neural network, neurons are interconnected by axons and synapses; when activated, such a network of neurons or cells can carry out a specific function
ANN (Kleene, 1956) Through backpropagation of errors, artificial neurons learn to perform a task (e.g., recognizing a cat) by considering many examples; the neurons are aggregated into layers with weights that adjust as learning proceeds
Machine learning (Samuel, 1959) Employs statistical techniques to automate analytical model building, without being explicitly programmed; it is about learning from example data, identifying patterns and making decisions
Deep learning (Schmidhuber, 2015) Also known as deep neural or deep belief networks, deep learning is about learning data representations, as opposed to task-specific algorithms, through supervised, semi-supervised or unsupervised means
Tab.8  AI learning approaches
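The layered, weight-adjusting learning that Tab.8 attributes to ANNs can be sketched in a few lines of code. The following minimal illustration (not from the article) trains a one-hidden-layer network on the XOR task by backpropagation of errors; the network size, learning rate, and task are all chosen only for demonstration.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# XOR training examples: (inputs, target)
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

H = 3  # hidden units
w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(H)]  # 2 inputs + bias
w_o = [random.uniform(-1, 1) for _ in range(H + 1)]                  # H hidden + bias

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
    o = sigmoid(sum(w_o[j] * h[j] for j in range(H)) + w_o[H])
    return h, o

def epoch(lr=0.5):
    """One pass over the data; returns the summed squared-error loss."""
    loss = 0.0
    for x, t in data:
        h, o = forward(x)
        loss += 0.5 * (o - t) ** 2
        d_o = (o - t) * o * (1 - o)          # output error signal
        for j in range(H):
            d_h = d_o * w_o[j] * h[j] * (1 - h[j])  # backpropagated to hidden layer
            w_h[j][0] -= lr * d_h * x[0]
            w_h[j][1] -= lr * d_h * x[1]
            w_h[j][2] -= lr * d_h
            w_o[j] -= lr * d_o * h[j]
        w_o[H] -= lr * d_o
    return loss

first = epoch()
for _ in range(2000):
    last = epoch()
print(round(first, 3), round(last, 3))
```

Each pass nudges every weight against its error gradient, so the loss falls as training proceeds; these are the "weights that adjust as learning proceeds" of Tab.8.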
Focus Scope
1. Manufacturing Semiconductor yields; supply chains; product quality
2. Professional Services Process streamlining; design thinking; digital consulting
3. Wholesale & Retail Wholesale distribution; retail space; customer service
4. Information & Communication Cloud management; corporate communications; information management
5. Financial Services Digital banking; algorithmic trading; robo-advisors
6. Construction Smart construction; building information modeling; construction management
7. Healthcare Drug discovery; CRISPR; dental implants; wearables
8. Lodging & food services Velociraptor receptionists; staffing schedules; food and beverage forecasts
9. Utilities Power production; power transmission; power consumption
10. Education Personal assistants; individualized learning; learning analytics
Tab.9  Projected top 10 output sectors benefitting from AI by 2035 (Accenture, 2017)
Fig.7  AI applications: difficulty versus value.
Component Element Big Data AI
Acquisition Focus Data-oriented Learning example-oriented
Emphasis Data quantity Learning example quantity
Scope Large sample Increased learning layers
Access Focus On-demand, cloud-computing On-demand, cloud-computing
Emphasis Real-time accessibility Real-time accessibility
Scope Cyber-security Cyber-security
Analytics Focus Analytical messiness Includes non-transparent layers
Emphasis Correlative relationship Reinforced learning examples
Scope Data-rich, information-unleashed Example-rich, lessons-learned
Application Focus Real-time feasibility Real-time feasibility
Emphasis Evidence-driven Learning-driven
Scope Good enough results Good enough results
Tab.10  Data processing approaches: AI versus Big Data
Component Element Potential concern
Acquisition Focus AI does not imply complete understanding of underlying problem
Emphasis AI’s learning quantity does not imply learning quality
Scope AI’s learning sample does not imply a complete sample
Access Focus AI’s on-demand accessibility may create privacy concerns
Emphasis AI’s real-time abilities may obscure past and future access concerns
Scope AI’s cyber-security concerns may overlook personal-security concerns
Analytics Focus AI’s hidden or non-transparent layers may obscure underlying relationships
Emphasis AI’s example reinforcements may result in an unintended causal understanding
Scope AI’s lessons learned may obscure underlying information and knowledge
Application Focus AI’s non-obvious explanations may obscure more probable explanations
Emphasis AI’s non-transparent findings may obscure underlying biases
Scope AI’s good enough findings may obscure simpler transparent findings
Tab.11  AI: potential concerns
Range/GHz Example applications
0.0–1.0 AM radio (540–1600 kHz), broadcast TV (54–806 MHz), FM radio (88.1–108.1 MHz), cell phone (800 MHz)
1.0–2.0 GPS (1.22 and 1.57 GHz), cell phone (1.0 GHz)
2.0–300.0 WiFi (2.4, 3, 5, and 60 GHz), satellite radio (2.3 GHz), Bluetooth, microwave oven
Above 300.0 Infrared, visible light, ultraviolet, X-rays, Gamma rays
Tab.12  Frequency spectrum applications
Inaugural date: generation (speed in kbps) Description of features
1980: 1G (2.4) Issues (possible eavesdropping by all-band radio receiver); basic analog voice service; promulgated by Japan’s NTT; analog protocols; growth fueled by cellular phones (an area is divided into cells)
1990: 2G (64) Benefits (digital encryption, short message service); basic digital voice service; promulgated by Finland’s Radiolinja; first digital standards (GPRS, GSM, CDMA, EDGE); improved coverage and capacity
2000: 3G (2000) First mobile broadband (wide band CDMA, WLAN, Bluetooth, global roaming, UMTS, HSDPA, Internet, video conferencing calls); basic voice service with some data (multimedia, text, etc.); promulgated by IMT
2010: 4G (100000) High mobility communications (cars, trains, HD mobile TV); true mobile broadband; Internet-based protocols (WiMax, LTE); primarily for digital data; promulgated by IMT-Advanced; growth fueled by smartphones
2020: 5G (to be detailed) Expected by South Korea; more reliable service; small cells; millimeter waves; faster speed (e.g., download an HD movie in 1 s versus 10 min under 4G); full duplex; beamforming; robust GPS; massive MIMO
Tab.13  Generations of wireless networks
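Tab.13's claim that 5G should download an HD movie in about 1 s, versus roughly 10 min under 4G, can be checked with back-of-the-envelope arithmetic. The sketch below assumes a movie size of about 5 GB (our assumption; the article does not state one) and uses the 4G peak rate of 100000 kbps from the table.

```python
def download_seconds(file_gb, rate_kbps):
    # decimal gigabytes -> bits, then divide by the rate in bits per second
    return (file_gb * 8e9) / (rate_kbps * 1e3)

t4g = download_seconds(5, 100_000)  # 4G peak rate from Tab.13
rate_for_1s = 5 * 8e9 / 1e3         # kbps needed to finish the same file in 1 s
print(t4g, rate_for_1s)
```

At 100000 kbps a 5 GB file takes 400 s (under 7 min, the same order as the table's "10 min"), while a 1 s download implies roughly 40 million kbps (40 Gbps); hence 5G's reliance on millimeter waves and massive MIMO.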
Area Example applications
Cellular telephony Clock synchronization enables time transfer to facilitate inter-cell handoff and support hybrid GPS/cellular position detection for geotagging and locating emergency callers
Celestial astronomy Positional and clock synchronization are employed in astrometry and celestial mechanics, including the discovery of extrasolar planets and the observation of atmospheric conditions
Robotic navigation Autonomous, real-time location and identification of routes for cars, trucks, planes to function without a human driver and without collisions
Site-specific management Modern agriculture is becoming more efficient (in fuel consumption), effective (in water use), and precision-based (in produce yield)
Tab.14  GPS applications
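The positional fixes behind several of the Tab.14 applications reduce to the same computation: given measured ranges to transmitters at known positions, solve for the receiver's location. Below is a minimal two-dimensional sketch; the beacon positions and test point are illustrative assumptions, and real GPS solves the three-dimensional analogue with an extra receiver clock-bias unknown.

```python
import math

# Three beacons at known positions and the (unknown-to-the-solver) true point
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
truth = (3.0, 4.0)
ranges = [math.dist(b, truth) for b in beacons]  # measured distances

def trilaterate(b, d):
    """Subtracting the first circle equation from the others
    yields a 2x2 linear system in (x, y), solved by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = b
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d[0]**2 - d[1]**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d[0]**2 - d[2]**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

x, y = trilaterate(beacons, ranges)
print(x, y)  # recovers the true position (3.0, 4.0)
```

With exact ranges the linear system recovers the position exactly; with noisy ranges (the realistic case), the same linearization is solved in a least-squares sense.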
AI focus Example autonomous ServGoods
Space Probe that can explore the surface of a planet and collect samples
Architecture Windows that can adapt to changing light, heat, etc.
Agriculture Drone that can control weeds, seed clouds, analyze soil, etc.
Infrastructure Dam that can maintain reservoir level and prevent flooding
Delivery Driverless vehicle or drone that can deliver packages to an address
Tab.15  Example autonomous ServGoods
Level Automation National Highway Traffic Safety Administration (2013) definition Driver (period)
0 No-automation The driver is in complete and sole control of the primary vehicle controls (brake, steering, throttle, and motive power) at all times Driver fully involved (from pre-2000)
1 Function-specific Automation Automation at this level involves one or more specific control functions. (Examples include electronic stability control or pre-charged brakes, where the vehicle automatically assists with braking to enable the driver to regain control of the vehicle or stop faster than possible by acting alone.) Driver mostly involved (from 2000 to 2010)
2 Combined function automation This level involves automation of at least two primary control functions designed to work in unison to relieve the driver of control of those functions. (An example of combined functions enabling a Level 2 system is adaptive cruise control in combination with lane centering.) Driver partially involved (from 2010 to 2020)
3 Limited self-driving automation Vehicles at this level of automation enable the driver to cede full control of all safety-critical functions under certain traffic or environmental conditions and in those conditions to rely heavily on the vehicle to monitor for changes in those conditions requiring transition back to driver control. The driver is expected to be available for occasional control, but with sufficiently comfortable transition time. (The Google car is an example of limited self-driving automation.) Driver mostly disengaged (from 2020 to 2030)
4 Full self-driving automation The vehicle is designed to perform all safety-critical driving functions and monitor roadway conditions for an entire trip. Such a design anticipates that the driver will provide destination or navigation input, but is not expected to be available for control at any time during the trip. This includes both occupied and unoccupied vehicles. (Alternative definitions include driverless, no-wheel and AV.) Driver fully disengaged (post-2030)
Tab.16  AV: levels of automation
System Scope
Parking assist On board cameras/sensors detect a suitable parking spot; car takes control of steering and braking so as to ease itself into place for either parallel or perpendicular parking.
Active cruise control Allowing for a varying gap between the car and the vehicle before it, the system can also bring the car to a full stop and allow it to creep slowly ahead, so that the driver never has to touch the brake or accelerator in stop-and-go highway traffic
Blind spot assist Cameras/sensors installed in side view mirror can tell when a vehicle pulls alongside car and can steer car back into lane if a vehicle is detected in adjacent lane
Forward collision prevention Cameras/sensors can warn driver of potential forward or side collisions with other vehicles, cyclists and pedestrians; brakes are applied to prevent or mitigate an accident or a frontal collision
Lane drift warning Lane departure warning systems use cameras mounted on side/rearview mirrors to watch for lane markings and warn driver when a tire is about to unintentionally drift over them; wheels are then steered back
Tab.17  Example auto assist systems
Discipline Example technology Scope
Biomedical 1. Biological sensors;
2. Driver sensors;
3. Metrology tools
1. Detecting toxic substances & driver behavior;
2. Wearables, embeddables, tattoos, etc.;
3. Testing and measuring of nanoscale features
Chemical 1. Sensing “snow smoke”;
2. Underwater sensors;
3. Chemical simulation
1. Including fuel leaks & hybrid energy sources;
2. Including hydrothermal vents, chemically altered water, etc.;
3. Neural network modeling & control
Computer science 1. AI;
2. Learning algorithms;
3. Data fusion
1. Machine learning & virtual personal assistants (e.g., Apple’s Siri);
2. Develop intuitive & predictive learning algorithms;
3. Sources: sensors, actuators, smartphones, people, etc.
Communications 1. V2X techniques;
2. Environmental sensing;
3. Real-time sensing
1. Where X=driver, vehicle, environment, infrastructure, or cloud;
2. Constant awareness of surroundings through radar, lidar, sonar, etc.;
3. Enhanced sensor range to keep car & driver constantly informed
Electrical 1. Adaptive headlights;
2. Cyber security protocols;
3. Electronic standards
1. For safety, sensors measure speed, steering angle and yaw;
2. Employ cyber techniques to keep vehicle secure inside and out;
3. Smart meters, electronic devices, energy sources, etc.
Energy 1. Energy standards;
2. Energy alternatives;
3. Energy considerations
1. Minimize CO2 emissions, meet Paris climate agreement of 2016, etc.;
2. Gas-only, hybrid (gas & battery), plug-in hybrid, all-electric, etc.;
3. Learning from unmanned air & sea vehicles (e.g., event data recorder)
Environmental 1. Human health monitors;
2. Vehicle presence;
3. Environmental monitors
1. Self-powered & embedded sensors which monitor human health;
2. Ensure that vehicle projects an interference-free & clear signal;
3. Self-powered and embedded sensors which monitor environment
Industrial 1. Simulation tools;
2. Decision analytics;
3. Human factors
1. Optimize service & product designs through simulation;
2. Cognitive computing & vehicle routing software;
3. Designing security & privacy that match user needs & expectations
Material 1. Nano particles;
2. Surface plasmonics;
3. Carbon fibre
1. Manufacturing of vehicular components using nano-particles;
2. Coupling light with nano-oscillations of information-laden electrons;
3. Light-weight, high-strength & high-performance carbon material
Mechanical 1. Antilock brakes;
2. 3D printing;
3. Cruise control
1. Allows wheels to maintain road contact while braking & turning;
2. Personalized design using layer-on-layer of nanomaterial;
3. A servomechanism that maintains the speed of a vehicle
Tab.18  AV: discipline-based example innovations
Issue Scope
Current AI
Today’s narrow or weak AI system is limited in its scope of possible, non-integrated actions
Today’s AI intelligence is certainly not that of a human brain; it is closer to that of a worm
AGI integration
AGI should be able to integrate sight, hearing, smell, taste, touch, judgment, action, etc.
In theory, AGI may concurrently control a person’s vehicle, smartphone, computer, pacemaker, etc.
AGI threats
Under a totalitarian regime, AGI could make “1984” a reality or result in a technological singularity
AGI can become an existential threat to humanity if it is able to recursively improve on itself
AGI limitations
AGI restrictions: United Kingdom Committee on AI; European Union’s GDPR
AGI actions must be the result of AI–human partnering or true collaboration
AGI cautions
Caution should be exercised, especially when the reasoning behind an AI action is not transparent
To minimize unintended consequences, AI developers must become more responsible
Tab.19  AGI: observations
1 Accenture (2017). Impact of Artificial Intelligence on Industry Growth by 2035. Report
2 J M Anderson, N Kalra, K D Stanley, P Sorenson, C Samaras, O A Oluwatola (2014). Autonomous Vehicle Technology: a Guide for Policymakers. Santa Monica: The RAND Corporation
3 A L Samuel (1959). Some studies in machine learning using the game of checkers. IBM Journal of Research and Development, 3(3): 210–229
https://doi.org/10.1147/rd.33.0210
4 R D Atkinson (2016). ‘It’s Going to Kill Us!’ and Other Myths About the Future of AI. Information Technology & Innovation Foundation
5 I Asimov (1950). I, Robot. New York: Gnome Press
6 D Castro, J New (2016). The Promise of Artificial Intelligence. Washington DC/Brussels: Center for Data Innovation
7 H Dreyfus (1972). What Computers Can’t Do. New York: MIT Press
8 B Gholami, W M Haddad, J M Bailey (2018). AI in the ICU: in the intensive care unit, artificial intelligence can keep watch. IEEE Spectrum, 55(10): 31–35
https://doi.org/10.1109/MSPEC.2018.8482421
9 J Hendler, A M Mulvehill (2016). Social Machines: the Coming Collision of Artificial Intelligence, Social Networking, and Humanity. New York: Apress
10 House of Lords Select Committee on Artificial Intelligence (2018). Five Proposed Principles for an AI Code. House of Lords of the United Kingdom Report
11 S C Kleene (1956). Representation of events in nerve nets and finite automata. In: Shannon C E, McCarthy J, eds. Automata Studies. Princeton: Princeton University Press, 3–41
12 J McCarthy, M L Minsky, N Rochester, C E Shannon (1955). A proposal for the Dartmouth research project on artificial intelligence. Republished in 2006. AI Magazine, 27(4): 11–14
13 W S McCulloch, W Pitts (1943). A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biophysics, 5(4): 115–133
https://doi.org/10.1007/BF02478259
14 M Minsky (1961). Steps toward artificial intelligence. Proceedings of the IRE, 49(1): 8–30
https://doi.org/10.1109/JRPROC.1961.287775
15 MIT (2017). 50 smartest companies in 2017. MIT Technology Review, 120(4): 54–57
16 L Mlodinow (2012). Subliminal: How Your Unconscious Mind Rules Your Behavior. New York: Pantheon Books
17 National Highway Traffic Safety Administration (2013). U.S. Department of Transportation Releases Policy on Automated Vehicle Development. Washington, DC: NHTSA
18 G Orwell (1949). 1984. London: Secker and Warburg
19 B W Parkinson, J J Spilker (1996). Global Positioning System: Theory and Applications. Reston: American Institute of Aeronautics and Astronautics
20 T S Perry (2018). GPS’ navigator in chief. IEEE Spectrum, 55(5): 46–51
https://doi.org/10.1109/MSPEC.2018.8352575
21 P E Ross (2015). Diabetes has a new enemy: robo-pancreas. IEEE Spectrum, 52(6): 40–44
https://doi.org/10.1109/MSPEC.2015.7115563
22 J Schmidhuber (2015). Deep learning in neural networks: an overview. Neural Networks, 61: 85–117
https://doi.org/10.1016/j.neunet.2014.09.003
23 M Tegmark (2018). Life 3.0: Being Human in the Age of Artificial Intelligence. New York: Knopf Doubleday Publishing Group
24 J M Tien (2003). Toward a decision informatics paradigm: a real-time information based approach to decision making. IEEE Transactions on Systems, Man and Cybernetics. Part C, Applications and Reviews, 33(1): 102–113
https://doi.org/10.1109/TSMCC.2003.809345
25 J M Tien (2012). The next industrial revolution: integrated services and goods. Journal of Systems Science and Systems Engineering, 21(3): 257–296
https://doi.org/10.1007/s11518-012-5194-1
26 J M Tien (2013). Big Data: unleashing information. Journal of Systems Science and Systems Engineering, 22(2): 127–151
https://doi.org/10.1007/s11518-013-5219-4
27 J M Tien (2014). Overview of big data: a US perspective. Bridge, 44(4): 12–19
28 J M Tien (2015). Internet of connected ServGoods: considerations, consequences and concerns. Journal of Systems Science and Systems Engineering, 24(2): 130–167
https://doi.org/10.1007/s11518-015-5273-1
29 J M Tien (2016). The sputnik of ServGoods: autonomous vehicles. Journal of Systems Engineering, 26(2): 10–38
30 J M Tien (2017). Internet of things, real-time decision making, and artificial intelligence. Annals of Data Science, 4(2): 149–178
https://doi.org/10.1007/s40745-017-0112-5
31 A Turing (1950). Computing machinery and intelligence. Mind, LIX(236): 433–460
https://doi.org/10.1093/mind/LIX.236.433