  • Mead, C. Neuromorphic electronic systems. Proc. IEEE 78, 1629–1636 (1990).

  • Mead, C. How we created neuromorphic engineering. Nat. Electron. 3, 434–435 (2020).

  • Schuman, C. D., Plank, J. S., Bruer, G. & Anantharaj, J. Non-traditional input encoding schemes for spiking neuromorphic systems. In 2019 International Joint Conference on Neural Networks (IJCNN) 1–10 (IEEE, 2019).

  • Sze, V., Chen, Y.-H., Emer, J., Suleiman, A. & Zhang, Z. Hardware for machine learning: challenges and opportunities. In 2017 IEEE Custom Integrated Circuits Conference (CICC) 1–8 (IEEE, 2017).

  • Mayr, C., Hoeppner, S. & Furber, S. SpiNNaker 2: a 10 million core processor system for brain simulation and machine learning. Preprint at https://arxiv.org/abs/1911.02385 (2019).

  • Furber, S. B., Galluppi, F., Temple, S. & Plana, L. A. The SpiNNaker project. Proc. IEEE 102, 652–665 (2014).

  • Davies, M. et al. Loihi: a neuromorphic manycore processor with on-chip learning. IEEE Micro 38, 82–99 (2018).

  • Mostafa, H., Müller, L. K. & Indiveri, G. An event-based architecture for solving constraint satisfaction problems. Nat. Commun. 6, 1–10 (2015).

  • Amir, A. et al. A low power, fully event-based gesture recognition system. In 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 7388–7397 (IEEE, 2017).

  • Schuman, C. D. et al. A survey of neuromorphic computing and neural networks in hardware. Preprint at https://arxiv.org/abs/1705.06963 (2017).

  • James, C. D. et al. A historical survey of algorithms and hardware architectures for neural-inspired and neuromorphic computing applications. Biol. Inspired Cogn. Archit. 19, 49–64 (2017).

  • Strukov, D., Indiveri, G., Grollier, J. & Fusi, S. Building brain-inspired computing. Nat. Commun. 10, 4838 (2019).

  • Thakur, C. S. et al. Large-scale neuromorphic spiking array processors: a quest to mimic the brain. Front. Neurosci. 12, 891 (2018).

  • Davies, M. et al. Advancing neuromorphic computing with Loihi: a survey of results and outlook. Proc. IEEE 109, 911–934 (2021).

  • Aimone, J. B. et al. Non-neural network applications for spiking neuromorphic hardware. In Proc. 3rd International Workshop on Post Moore's Era Supercomputing 24–26 (PMES, 2018).

  • Polykretis, I., Tang, G. & Michmizos, K. P. An astrocyte-modulated neuromorphic central pattern generator for hexapod robot locomotion on Intel's Loihi. In International Conference on Neuromorphic Systems 2020 1–9 (ACM, 2020).

  • Irizarry-Valle, Y. & Parker, A. C. An astrocyte neuromorphic circuit that influences neuronal phase synchrony. IEEE Trans. Biomed. Circuits Syst. 9, 175–187 (2015).

  • Potok, T., Schuman, C., Patton, R. & Li, H. Neuromorphic Computing Architectures, Models, and Applications: A Beyond-CMOS Approach to Future Computing (US Department of Energy, 2016).

  • Yin, S. et al. Algorithm and hardware design of discrete-time spiking neural networks based on back propagation with binary activations. In 2017 IEEE Biomedical Circuits and Systems Conference (BioCAS) 1–5 (IEEE, 2017).

  • Schemmel, J. et al. A wafer-scale neuromorphic hardware system for large-scale neural modeling. In 2010 IEEE International Symposium on Circuits and Systems (ISCAS) 1947–1950 (IEEE, 2010).

  • Neftci, E. O., Mostafa, H. & Zenke, F. Surrogate gradient learning in spiking neural networks: bringing the power of gradient-based optimization to spiking neural networks. IEEE Signal Process. Mag. 36, 51–63 (2019).

  • Pei, J. et al. Towards artificial general intelligence with hybrid Tianjic chip architecture. Nature 572, 106–111 (2019).

  • Merolla, P. A. et al. A million spiking-neuron integrated circuit with a scalable communication network and interface. Science 345, 668–673 (2014).

  • Moradi, S., Qiao, N., Stefanini, F. & Indiveri, G. A scalable multicore architecture with heterogeneous memory structures for dynamic neuromorphic asynchronous processors (DYNAPs). IEEE Trans. Biomed. Circuits Syst. 12, 106–122 (2017).

  • Benjamin, B. V. et al. Neurogrid: a mixed-analog-digital multichip system for large-scale neural simulations. Proc. IEEE 102, 699–716 (2014).

  • Schemmel, J., Billaudelle, S., Dauer, P. & Weis, J. Accelerated analog neuromorphic computing. Preprint at https://arxiv.org/abs/2003.11996 (2020).

  • Bohnstingl, T., Scherr, F., Pehle, C., Meier, K. & Maass, W. Neuromorphic hardware learns to learn. Front. Neurosci. 13, 483 (2019).

  • Islam, R. et al. Device and materials requirements for neuromorphic computing. J. Phys. D 52, 113001 (2019).

  • Nandakumar, S., Kulkarni, S. R., Babu, A. V. & Rajendran, B. Building brain-inspired computing systems: examining the role of nanoscale devices. IEEE Nanotechnol. Mag. 12, 19–35 (2018).

  • Najem, J. S. et al. Memristive ion channel-doped biomembranes as synaptic mimics. ACS Nano 12, 4702–4711 (2018).

  • Jo, S. H. et al. Nanoscale memristor device as synapse in neuromorphic systems. Nano Lett. 10, 1297–1301 (2010).

  • Li, Y., Wang, Z., Midya, R., Xia, Q. & Yang, J. J. Review of memristor devices in neuromorphic computing: materials sciences and device challenges. J. Phys. D 51, 503002 (2018).

  • Wu, Y., Deng, L., Li, G., Zhu, J. & Shi, L. Spatio-temporal backpropagation for training high-performance spiking neural networks. Front. Neurosci. 12, 331 (2018).

  • Kulkarni, S. R. & Rajendran, B. Spiking neural networks for handwritten digit recognition–supervised learning and network optimization. Neural Netw. 103, 118–127 (2018).

  • Anwani, N. & Rajendran, B. Training multi-layer spiking neural networks using normad based spatio-temporal error backpropagation. Neurocomputing 380, 67–77 (2020).

  • Bagheri, A., Simeone, O. & Rajendran, B. Training probabilistic spiking neural networks with first-to-spike decoding. In 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 2986–2990 (IEEE, 2018).

  • Göltz, J. et al. Fast and deep neuromorphic learning with time-to-first-spike coding. Preprint at https://arxiv.org/abs/1912.11443 (2019).

  • Lee, J. H., Delbruck, T. & Pfeiffer, M. Training deep spiking neural networks using backpropagation. Front. Neurosci. 10, 508 (2016).

  • Lee, C., Sarwar, S. S., Panda, P., Srinivasan, G. & Roy, K. Enabling spike-based backpropagation for training deep neural network architectures. Front. Neurosci. https://doi.org/10.3389/fnins.2020.00119 (2020).

  • Zenke, F. & Neftci, E. O. Brain-inspired learning on neuromorphic substrates. Proc. IEEE 109, 935–950 (2021).

  • Cramer, B., Stradmann, Y., Schemmel, J. & Zenke, F. The Heidelberg spiking data sets for the systematic evaluation of spiking neural networks. IEEE Trans. Neural Netw. Learn. Syst. https://doi.org/10.1109/TNNLS.2020.3044364 (2020).

  • Diehl, P. U., Zarrella, G., Cassidy, A., Pedroni, B. U. & Neftci, E. Conversion of artificial recurrent neural networks to spiking neural networks for low-power neuromorphic hardware. In 2016 IEEE International Conference on Rebooting Computing (ICRC) 1–8 (IEEE, 2016).

  • Hunsberger, E. & Eliasmith, C. Training spiking deep networks for neuromorphic hardware. Preprint at https://arxiv.org/abs/1611.05141 (2016).

  • Sengupta, A., Ye, Y., Wang, R., Liu, C. & Roy, K. Going deeper in spiking neural networks: VGG and residual architectures. Front. Neurosci. 13, 95 (2019).

  • Severa, W., Vineyard, C. M., Dellana, R., Verzi, S. J. & Aimone, J. B. Training deep neural networks for binary communication with the whetstone method. Nat. Mach. Intell. 1, 86–94 (2019).

  • Rueckauer, B., Lungu, I.-A., Hu, Y., Pfeiffer, M. & Liu, S.-C. Conversion of continuous-valued deep networks to efficient event-driven networks for image classification. Front. Neurosci. 11, 682 (2017).

  • Stöckl, C. & Maass, W. Optimized spiking neurons can classify images with high accuracy through temporal coding with two spikes. Nat. Mach. Intell. 3, 230–238 (2021).

  • Blouw, P., Choo, X., Hunsberger, E. & Eliasmith, C. Benchmarking keyword spotting efficiency on neuromorphic hardware. In NICE ’19: Proc. 7th Annual Neuro-inspired Computational Elements Workshop 1–8 (ACM, 2019).

  • Getty, N., Brettin, T., Jin, D., Stevens, R. & Xia, F. Deep medical image analysis with representation learning and neuromorphic computing. Interface Focus 11, 20190122 (2021).

  • Shukla, R., Lipasti, M., Van Essen, B., Moody, A. & Maruyama, N. Remodel: rethinking deep CNN models to detect and count on a neurosynaptic system. Front. Neurosci. 13, 4 (2019).

  • Tanaka, G. et al. Recent advances in physical reservoir computing: a review. Neural Netw. 115, 100–123 (2019).

  • Kudithipudi, D., Saleh, Q., Merkel, C., Thesing, J. & Wysocki, B. Design and analysis of a neuromemristive reservoir computing architecture for biosignal processing. Front. Neurosci. 9, 502 (2016).

  • Du, C. et al. Reservoir computing using dynamic memristors for temporal information processing. Nat. Commun. 8, 1–10 (2017).

  • Wijesinghe, P., Srinivasan, G., Panda, P. & Roy, K. Analysis of liquid ensembles for enhancing the performance and accuracy of liquid state machines. Front. Neurosci. 13, 504 (2019).

  • Soures, N. & Kudithipudi, D. Deep liquid state machines with neural plasticity for video activity recognition. Front. Neurosci. 13, 686 (2019).

  • Schuman, C. D., Mitchell, J. P., Patton, R. M., Potok, T. E. & Plank, J. S. Evolutionary optimization for neuromorphic systems. In Proc. Neuro-inspired Computational Elements Workshop 1–9 (ACM, 2020).

  • Schaffer, J. D. Evolving spiking neural networks for robot sensory-motor decision tasks of varying difficulty. In Proc. Neuro-inspired Computational Elements Workshop 1–7 (ACM, 2020).

  • Schliebs, S. & Kasabov, N. Evolving spiking neural network–a survey. Evol. Syst. 4, 87–98 (2013).

  • Plank, J. S. et al. The TENNLab suite of LIDAR-based control applications for recurrent, spiking, neuromorphic systems. In 44th Annual GOMACTech Conference (GOMAC Tech, 2019); http://neuromorphic.eecs.utk.edu/raw/files/publications/2019-Plank-Gomac.pdf

  • Mitchell, J. P. et al. Neon: neuromorphic control for autonomous robotic navigation. In 2017 IEEE International Symposium on Robotics and Intelligent Sensors (IRIS) 136–142 (IEEE, 2017).

  • Bi, G.-Q. & Poo, M.-M. Synaptic modifications in cultured hippocampal neurons: dependence on spike timing, synaptic strength, and postsynaptic cell type. J. Neurosci. 18, 10464–10472 (1998).

  • Shrestha, A., Ahmed, K., Wang, Y. & Qiu, Q. Stable spike-timing dependent plasticity rule for multilayer unsupervised and supervised learning. In 2017 International Joint Conference on Neural Networks (IJCNN) 1999–2006 (IEEE, 2017).

  • Mozafari, M., Kheradpisheh, S. R., Masquelier, T., Nowzari-Dalini, A. & Ganjtabesh, M. First-spike-based visual categorization using reward-modulated STDP. IEEE Trans. Neural Netw. Learn. Syst. 29, 6178–6190 (2018).

  • Lee, C., Panda, P., Srinivasan, G. & Roy, K. Training deep spiking convolutional neural networks with STDP-based unsupervised pre-training followed by supervised fine-tuning. Front. Neurosci. 12, 435 (2018).

  • Kaiser, J., Mostafa, H. & Neftci, E. Synaptic plasticity dynamics for deep continuous local learning (decolle). Front. Neurosci. 14, 424 (2020).

  • Bellec, G. et al. A solution to the learning dilemma for recurrent networks of spiking neurons. Nat. Commun. 11, 1–15 (2020).

  • Martin, E. et al. EqSpike: spike-driven equilibrium propagation for neuromorphic implementations. iScience 24, 102222 (2021).

  • Mukhopadhyay, A. K., Sharma, A., Chakrabarti, I., Basu, A. & Sharad, M. Power-efficient spike sorting scheme using analog spiking neural network classifier. ACM J. Emerg. Technol. Comput. Syst. 17, 1–29 (2021).

  • Nessler, B., Pfeiffer, M., Buesing, L. & Maass, W. Bayesian computation emerges in generic cortical microcircuits through spike-timing-dependent plasticity. PLoS Comput. Biol. 9, e1003037 (2013).

  • Kasabov, N. K. NeuCube: a spiking neural network architecture for mapping, learning and understanding of spatio-temporal brain data. Neural Netw. 52, 62–76 (2014).

  • Budhraja, S. et al. Sleep stage classification using NeuCube on SpiNNaker: a preliminary study. In 2020 International Joint Conference on Neural Networks (IJCNN) 1–8 (IEEE, 2020).

  • Kumarasinghe, K., Owen, M., Taylor, D., Kasabov, N. & Kit, C. FaNeuRobot: a framework for robot and prosthetics control using the NeuCube spiking neural network architecture and finite automata theory. In 2018 IEEE International Conference on Robotics and Automation (ICRA) 4465–4472 (IEEE, 2018).

  • Izhikevich, E. M. Polychronization: computation with spikes. Neural Comput. 18, 245–282 (2006).

  • Wang, F., Severa, W. M. & Rothganger, F. Acquisition and representation of spatio-temporal signals in polychronizing spiking neural networks. In Proc. 7th Annual Neuro-inspired Computational Elements Workshop 1–5 (ACM, 2019).

  • Alemi, A., Machens, C., Deneve, S. & Slotine, J.-J. Learning nonlinear dynamics in efficient, balanced spiking networks using local plasticity rules. In Proc. AAAI Conference on Artificial Intelligence Vol. 32 (AAAI, 2018).

  • Maass, W. On the computational power of winner-take-all. Neural Comput. 12, 2519–2535 (2000).

  • Oster, M., Douglas, R. & Liu, S.-C. Computation with spikes in a winner-take-all network. Neural Comput. 21, 2437–2465 (2009).

  • Kappel, D., Nessler, B. & Maass, W. STDP installs in winner-take-all circuits an online approximation to hidden Markov model learning. PLoS Comput. Biol. 10, e1003511 (2014).

  • Gütig, R. & Sompolinsky, H. The tempotron: a neuron that learns spike timing–based decisions. Nat. Neurosci. 9, 420–428 (2006).

  • Bohte, S. M., Kok, J. N. & La Poutre, H. Error-backpropagation in temporally encoded networks of spiking neurons. Neurocomputing 48, 17–37 (2002).

  • Wang, Q., Rothkopf, C. A. & Triesch, J. A model of human motor sequence learning explains facilitation and interference effects based on spike-timing dependent plasticity. PLoS Comput. Biol. 13, e1005632 (2017).

  • Li, S. & Yu, Q. New efficient multi-spike learning for fast processing and robust learning. In Proc. AAAI Conference on Artificial Intelligence Vol. 34, 4650–4657 (AAAI, 2020).

  • Zenke, F. & Ganguli, S. Superspike: supervised learning in multilayer spiking neural networks. Neural Comput. 30, 1514–1541 (2018).

  • Petro, B., Kasabov, N. & Kiss, R. M. Selection and optimization of temporal spike encoding methods for spiking neural networks. IEEE Trans. Neural Netw. Learn. Syst. 31, 358–370 (2019).

  • Hamilton, K. E., Mintz, T. M. & Schuman, C. D. Spike-based primitives for graph algorithms. Preprint at https://arxiv.org/abs/1903.10574 (2019).

  • Corder, K., Monaco, J. V. & Vindiola, M. M. Solving vertex cover via ising model on a neuromorphic processor. In 2018 IEEE International Symposium on Circuits and Systems (ISCAS) 1–5 (IEEE, 2018).

  • Kay, B., Date, P. & Schuman, C. Neuromorphic graph algorithms: extracting longest shortest paths and minimum spanning trees. In Proc. Neuro-Inspired Computational Elements Workshop 1–6 (ACM, 2020).

  • Ali, A. & Kwisthout, J. A spiking neural algorithm for the network flow problem. Preprint at https://arxiv.org/abs/1911.13097 (2019).

  • Aimone, J. B. et al. Provable neuromorphic advantages for computing shortest paths. In Proc. 32nd ACM Symposium on Parallelism in Algorithms and Architectures 497–499 (ACM, 2020).

  • Hamilton, K., Date, P., Kay, B. & Schuman, C. D. Modeling epidemic spread with spike-based models. In International Conference on Neuromorphic Systems 2020 1–5 (ACM, 2020).

  • Severa, W., Lehoucq, R., Parekh, O. & Aimone, J. B. Spiking neural algorithms for Markov process random walk. In 2018 International Joint Conference on Neural Networks (IJCNN) 1–8 (IEEE, 2018).

  • Smith, J. D. et al. Neuromorphic scaling advantages for energy-efficient random walk computations. Preprint at https://arxiv.org/abs/2107.13057 (2021).

  • Cook, M. Networks of Relations (California Institute of Technology, 2005).

  • Diehl, P. U. & Cook, M. Learning and inferring relations in cortical networks. Preprint at https://arxiv.org/abs/1608.08267 (2016).

  • Diehl, P. U. & Cook, M. Unsupervised learning of digit recognition using spike-timing-dependent plasticity. Front. Comput. Neurosci. 9, 99 (2015).

  • Alom, M. Z., Van Essen, B., Moody, A. T., Widemann, D. P. & Taha, T. M. Quadratic unconstrained binary optimization (QUBO) on neuromorphic computing system. In 2017 International Joint Conference on Neural Networks (IJCNN) 3922–3929 (IEEE, 2017).

  • Mniszewski, S. M. Graph partitioning as quadratic unconstrained binary optimization (QUBO) on spiking neuromorphic hardware. In Proc. International Conference on Neuromorphic Systems 1–5 (ACM, 2019).

  • Yakopcic, C., Rahman, N., Atahary, T., Taha, T. M. & Douglass, S. Solving constraint satisfaction problems using the Loihi spiking neuromorphic processor. In 2020 Design, Automation & Test in Europe Conference & Exhibition (DATE) 1079–1084 (IEEE, 2020).

  • Mostafa, H., Müller, L. K. & Indiveri, G. Rhythmic inhibition allows neural networks to search for maximally consistent states. Neural Comput. 27, 2510–2547 (2015).

  • Fonseca Guerra, G. A. & Furber, S. B. Using stochastic spiking neural networks on SpiNNaker to solve constraint satisfaction problems. Front. Neurosci. 11, 714 (2017).

  • Pecevski, D., Buesing, L. & Maass, W. Probabilistic inference in general graphical models through sampling in stochastic networks of spiking neurons. PLoS Comput. Biol. 7, e1002294 (2011).

  • Dagum, P. & Luby, M. An optimal approximation algorithm for Bayesian inference. Artif. Intell. 93, 1–27 (1997).

  • Knight, J. C. & Nowotny, T. GPUs outperform current HPC and neuromorphic solutions in terms of speed and energy when simulating a highly-connected cortical model. Front. Neurosci. 12, 941 (2018).

  • Gewaltig, M.-O. & Diesmann, M. NEST (Neural Simulation Tool). Scholarpedia 2, 1430 (2007).

  • Goodman, D. F. & Brette, R. Brian: a simulator for spiking neural networks in Python. Front. Neuroinform. 2, 5 (2008).

  • Bekolay, T. et al. Nengo: a Python tool for building large-scale functional brain models. Front. Neuroinform. 7, 48 (2014).

  • Stewart, T. C. A Technical Overview of the Neural Engineering Framework (University of Waterloo, 2012).

  • Kulkarni, S. R., Parsa, M., Mitchell, J. P. & Schuman, C. D. Benchmarking the performance of neuromorphic and spiking neural network simulators. Neurocomputing 447, 145–160 (2021).

  • Vetter, J. S. et al. Extreme Heterogeneity 2018: Productive Computational Science in the Era of Extreme Heterogeneity. Report for DOE ASCR Workshop on Extreme Heterogeneity, Technical Report (US Department of Energy, 2018).

  • Diamond, A., Nowotny, T. & Schmuker, M. Comparing neuromorphic solutions in action: implementing a bio-inspired solution to a benchmark classification task on three parallel-computing platforms. Front. Neurosci. 9, 491 (2016).

  • Mishkin, D., Sergievskiy, N. & Matas, J. Systematic evaluation of convolution neural network advances on the ImageNet. Comput. Vis. Image Underst. 161, 11–19 (2017).

  • Orchard, G., Jayawant, A., Cohen, G. K. & Thakor, N. Converting static image datasets to spiking neuromorphic datasets using saccades. Front. Neurosci. 9, 437 (2015).

  • Tuggener, L., Schmidhuber, J. & Stadelmann, T. Is it enough to optimize CNN architectures on ImageNet? Preprint at https://arxiv.org/abs/2103.09108 (2021).

  • Sandamirskaya, Y. Dynamic neural fields as a step toward cognitive neuromorphic architectures. Front. Neurosci. 7, 276 (2014).

  • Plank, J. S., Zheng, C., Schuman, C. D. & Dean, C. Spiking neuromorphic networks for binary tasks. In International Conference on Neuromorphic Systems (ICONS) 1–8 (ACM, 2021).

  • Smith, J. D. et al. Solving a steady-state PDE using spiking networks and neuromorphic hardware. In International Conference on Neuromorphic Systems 2020 1–8 (ACM, 2020).

  • Aimone, J. B. A roadmap for reaching the potential of brain-derived computing. Adv. Intell. Syst. 3, 2000191 (2021).

  • Douglas, R., Mahowald, M. & Mead, C. Neuromorphic analogue VLSI. Annu. Rev. Neurosci. 18, 255–281 (1995).

  • Parsa, M. et al. Bayesian multi-objective hyperparameter optimization for accurate, fast, and efficient neural network accelerator design. Front. Neurosci. 14, 667 (2020).

  • Parsa, M., Ankit, A., Ziabari, A. & Roy, K. PABO: pseudo agent-based multi-objective Bayesian hyperparameter optimization for efficient neural accelerator design. In 2019 IEEE/ACM International Conference on Computer-Aided Design (ICCAD) 1–8 (IEEE, 2019).

  • Parsa, M. et al. Bayesian-based hyperparameter optimization for spiking neuromorphic systems. In 2019 IEEE International Conference on Big Data (Big Data) 4472–4478 (IEEE, 2019).

  • Indiveri, G. et al. Neuromorphic silicon neuron circuits. Front. Neurosci. 5, 73 (2011).
