Institution

West Virginia University

Education · Morgantown, West Virginia, United States

About: West Virginia University is an education organization based in Morgantown, West Virginia, United States. It is known for research contributions in the topics of Population & Poison control. The organization has 25,632 authors who have published 48,308 publications receiving 1,343,934 citations. The organization is also known as WVU.


Papers
Journal ArticleDOI
Nicholas J Kassebaum, Megha Arora, Ryan M Barber, Zulfiqar A Bhutta, +679 more · Institutions (268)
TL;DR: In this paper, the authors used the Global Burden of Diseases, Injuries, and Risk Factors Study 2015 (GBD 2015) for all-cause mortality, cause-specific mortality, and non-fatal disease burden to derive HALE and DALYs by sex for 195 countries and territories from 1990 to 2015.
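For readers unfamiliar with these metrics: in the GBD framework, DALYs combine fatal and non-fatal burden, and HALE is life expectancy adjusted for time lived in poor health. A minimal sketch of the standard decomposition, with generic notation that is not taken from this paper:

```latex
% Standard GBD-style burden decomposition (illustrative notation)
\mathrm{DALY} = \mathrm{YLL} + \mathrm{YLD}, \qquad
\mathrm{YLL} = \sum_{c,a} N_{c,a}\, L_a, \qquad
\mathrm{YLD} = \sum_{c,a} P_{c,a}\, \mathrm{DW}_c
```

Here N_{c,a} is deaths from cause c at age a, L_a the residual standard life expectancy at age a, P_{c,a} prevalence, and DW_c the disability weight for cause c.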

1,533 citations

Journal ArticleDOI
TL;DR: In this article, the authors surveyed management teams in 102 hotel properties in the United States to examine the intervening roles of knowledge sharing and team efficacy in the relationship between empowering leadership and team performance.
Abstract: We surveyed management teams in 102 hotel properties in the United States to examine the intervening roles of knowledge sharing and team efficacy in the relationship between empowering leadership and team performance. Team performance was measured through a time-lagged market-based source. Results showed that empowering leadership was positively related to both knowledge sharing and team efficacy, which, in turn, were both positively related to performance.

1,470 citations

Proceedings ArticleDOI
13 May 2019
TL;DR: Wang et al. propose a heterogeneous graph neural network based on hierarchical attention, including node-level and semantic-level attention, which generates node embeddings by aggregating features from meta-path based neighbors in a hierarchical manner.
Abstract: Graph neural networks, as a powerful graph representation technique based on deep learning, have shown superior performance and attracted considerable research interest. However, they have not been fully considered for heterogeneous graphs, which contain different types of nodes and links. The heterogeneity and rich semantic information bring great challenges for designing a graph neural network for heterogeneous graphs. Recently, one of the most exciting advancements in deep learning is the attention mechanism, whose great potential has been well demonstrated in various areas. In this paper, we first propose a novel heterogeneous graph neural network based on hierarchical attention, including node-level and semantic-level attention. Specifically, the node-level attention aims to learn the importance between a node and its meta-path based neighbors, while the semantic-level attention is able to learn the importance of different meta-paths. With the learned importance from both node-level and semantic-level attention, the importance of nodes and meta-paths can be fully considered. The proposed model can then generate node embeddings by aggregating features from meta-path based neighbors in a hierarchical manner. Extensive experimental results on three real-world heterogeneous graphs not only show the superior performance of our proposed model over the state of the art, but also demonstrate its potentially good interpretability for graph analysis.
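To make the two-level attention concrete, here is a minimal NumPy sketch of the idea described above: node-level attention over meta-path based neighbors, then semantic-level attention over meta-paths. All names, shapes, and the single-head, single-layer structure are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of node-level + semantic-level attention (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

n_nodes, d_in, d_hid = 6, 8, 4
features = rng.normal(size=(n_nodes, d_in))

# Two hypothetical meta-paths, each given as an adjacency matrix over the same node set.
metapath_adjs = [
    (rng.random((n_nodes, n_nodes)) > 0.5).astype(float),
    (rng.random((n_nodes, n_nodes)) > 0.5).astype(float),
]
for A in metapath_adjs:
    np.fill_diagonal(A, 1.0)  # include self-loops so every node has a neighbor

W = rng.normal(size=(d_in, d_hid))       # shared feature projection
a_node = rng.normal(size=(2 * d_hid,))   # node-level attention vector
q_sem = rng.normal(size=(d_hid,))        # semantic-level attention vector

def node_level_attention(A):
    """Aggregate meta-path neighbors, weighting each neighbor by learned attention."""
    h = features @ W                                           # (n, d_hid)
    # Pairwise logits e_ij = LeakyReLU(a^T [h_i || h_j]), masked to neighbors in A
    logits = np.concatenate(
        [np.repeat(h, n_nodes, 0), np.tile(h, (n_nodes, 1))], axis=1) @ a_node
    logits = logits.reshape(n_nodes, n_nodes)
    logits = np.where(logits > 0, logits, 0.2 * logits)        # LeakyReLU
    logits = np.where(A > 0, logits, -1e9)                     # mask non-neighbors
    alpha = softmax(logits, axis=1)
    return np.tanh(alpha @ h)                                  # meta-path specific embedding

def semantic_level_attention(per_metapath_embeddings):
    """Fuse meta-path specific embeddings with one learned weight per meta-path."""
    Z = np.stack(per_metapath_embeddings)                      # (m, n, d_hid)
    w = np.tanh(Z) @ q_sem                                     # (m, n)
    beta = softmax(w.mean(axis=1))                             # one weight per meta-path
    return np.tensordot(beta, Z, axes=1)                       # (n, d_hid)

z = semantic_level_attention([node_level_attention(A) for A in metapath_adjs])
print(z.shape)  # (6, 4): one final embedding per node
```

The point the abstract makes is visible in the structure: each meta-path gets its own node-level aggregation, and a second softmax over meta-paths decides how much each semantic view contributes to the final embedding.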

1,467 citations

Journal ArticleDOI
TL;DR: In this paper, the authors examine the role of monetary rewards in encouraging knowledge sharing in organizations through four mechanisms of knowledge sharing, and propose that team-based rewards and company-wide incentives (profit sharing, gainsharing, and employee stock options) would be particularly instrumental in enhancing knowledge sharing within teams and across work units, respectively.
Abstract: This article examines the role of monetary rewards in encouraging knowledge sharing in organizations through four mechanisms of knowledge sharing. We argue that the system of contributing knowledge to databases is the most amenable to rewards contingent on knowledge sharing behaviors because of opportunities for the reward allocator to measure the knowledge sharing behaviors. In the case of formal interactions within or across teams and work units, while rewards could be made partly contingent on knowledge sharing behaviors as in merit pay, rewards based on collective performance are also likely to be effective in creating a feeling of cooperation, ownership, and commitment among employees. In addition, we propose that team-based rewards and company-wide incentives (profit sharing, gainsharing, and employee stock options) would be particularly instrumental in enhancing knowledge sharing within teams and across work units, respectively. In the case of knowledge sharing through informal interactions, the ke...

1,442 citations

Journal ArticleDOI
TL;DR: The so-called nonlocally centralized sparse representation (NCSR) model is as simple as the standard sparse representation model, and the extensive experiments validate the generality and state-of-the-art performance of the proposed NCSR algorithm.
Abstract: Sparse representation models code an image patch as a linear combination of a few atoms chosen out from an over-complete dictionary, and they have shown promising results in various image restoration applications. However, due to the degradation of the observed image (e.g., noisy, blurred, and/or down-sampled), the sparse representations by conventional models may not be accurate enough for a faithful reconstruction of the original image. To improve the performance of sparse representation-based image restoration, in this paper the concept of sparse coding noise is introduced, and the goal of image restoration turns to how to suppress the sparse coding noise. To this end, we exploit the image nonlocal self-similarity to obtain good estimates of the sparse coding coefficients of the original image, and then centralize the sparse coding coefficients of the observed image to those estimates. The so-called nonlocally centralized sparse representation (NCSR) model is as simple as the standard sparse representation model, while our extensive experiments on various types of image restoration problems, including denoising, deblurring and super-resolution, validate the generality and state-of-the-art performance of the proposed NCSR algorithm.
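As a rough sketch of the kind of objective the abstract describes (symbols are assumed here, not copied from the paper), the sparse codes of the degraded image are pushed toward nonlocal estimates rather than toward zero:

```latex
% Centralized sparse coding: penalize deviation from nonlocal estimates \beta_i
\hat{\alpha} = \arg\min_{\alpha} \; \|y - H D \alpha\|_2^2
  + \lambda \sum_i \|\alpha_i - \beta_i\|_1,
\qquad \beta_i = \sum_{q} w_{i,q}\, \alpha_{i,q}
```

where y is the observed image, H the degradation operator (blur, down-sampling, or identity for denoising), D the dictionary, \alpha_i the code of patch i, and \beta_i a similarity-weighted average of the codes of its nonlocal neighbors; setting all \beta_i = 0 would recover the standard sparse representation model.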

1,441 citations


Authors

Showing all 25957 results

Name | H-index | Papers | Citations
Graham A. Colditz | 261 | 1542 | 256034
Zhong Lin Wang | 245 | 2529 | 259003
Michael Kramer | 167 | 1713 | 127224
Gabriel Núñez | 148 | 466 | 105724
Darwin J. Prockop | 128 | 576 | 87066
Adrian Bauman | 127 | 1061 | 91151
Chao Zhang | 127 | 3119 | 84711
Robert J. Motzer | 121 | 883 | 80129
Mark W. Dewhirst | 116 | 797 | 57525
Alessandra Romero | 115 | 1143 | 69571
Xiaoming Li | 113 | 1932 | 72445
Stephen M. Davis | 109 | 675 | 53144
Alan Campbell | 109 | 687 | 53463
Steven C. Hayes | 106 | 450 | 51556
I. A. Bilenko | 105 | 393 | 68801
Network Information
Related Institutions (5)
University of Minnesota: 257.9K papers, 11.9M citations (96% related)
University of Wisconsin-Madison: 237.5K papers, 11.8M citations (95% related)
University of Pittsburgh: 201K papers, 9.6M citations (94% related)
University of Texas at Austin: 206.2K papers, 9M citations (94% related)
University of North Carolina at Chapel Hill: 185.3K papers, 9.9M citations (94% related)
Performance Metrics
No. of papers from the Institution in previous years
Year | Papers
2023 | 86
2022 | 499
2021 | 2,766
2020 | 2,672
2019 | 2,519
2018 | 2,416