Authors: Ang Yu, Jiwei Zhao
Abstract: Sociological Methodology, Ahead of Print. This article addresses two prominent theses in social stratification research, the great equalizer thesis and Mare’s school transition thesis. Both theses describe the role of an intermediate educational transition in the association between socioeconomic ...
Citation: Sociological Methodology
PubDate: 2025-06-09T12:43:35Z
DOI: 10.1177/00811750251340392
Authors: Miranda J. Lubbers, Michał Bojanowski, Nuria Targarona Rifa, Alejandro Ciordia
Abstract: Sociological Methodology, Ahead of Print. Aggregated relational data (ARD), derived from questions of the form “How many people do you know who [belong to subpopulation X]?” are widely used to estimate the size and composition of social networks, often adopting the network scale-up method (NSUM). ...
Citation: Sociological Methodology
PubDate: 2025-06-05T10:03:35Z
DOI: 10.1177/00811750251340398
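For orientation, a minimal sketch of the classic network scale-up estimator follows. It is not the estimator or refinement proposed in the article, and the population sizes, responses, and variable names are all made up.

```python
import numpy as np

# Illustrative sketch of the classic network scale-up estimator (NSUM),
# not the method proposed in the article. All numbers are invented.
N = 1_000_000                                   # total population size (assumed known)
known_sizes = np.array([5000, 20000, 12000])    # sizes of reference subpopulations

# ARD responses: rows = respondents, columns = reference subpopulations
# ("How many people do you know who belong to subpopulation k?")
ard_known = np.array([
    [1, 4, 2],
    [0, 6, 3],
    [2, 5, 1],
])
# Reports about the hidden (target) subpopulation
ard_hidden = np.array([1, 2, 0])

# Step 1: estimate each respondent's personal network size d_i
d_hat = ard_known.sum(axis=1) * N / known_sizes.sum()

# Step 2: scale up reports about the hidden population
hidden_size_hat = N * ard_hidden.sum() / d_hat.sum()
print(hidden_size_hat)
```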
Authors: Alexis Rowland, Naomi F. Sugie, Kristin Turney
Abstract: Sociological Methodology, Ahead of Print. Research, advocacy, and archival projects related to incarceration often lack knowledge about the ongoing conditions of carceral facilities, and myriad challenges prevent stakeholders from successfully conducting outreach with incarcerated people. Using a ...
Citation: Sociological Methodology
PubDate: 2025-04-16T12:45:13Z
DOI: 10.1177/00811750251331970
Authors: Anders Holm, Ben Jann, Kristian Bernt Karlson
Abstract: Sociological Methodology, Ahead of Print. The proportion of explained variance is well defined in linear models, but Snijders and Bosker demonstrated that this concept is ill defined in linear multilevel models. Whenever a researcher adds a level 1 predictor to the model, the level 2 variance may increase because the level 2 variance also depends on the level 1 variance. This problem is more pronounced when there are few observations per cluster. The authors present a solution that allows researchers to decompose variance components from null models into parts explained and unexplained by level 1 predictors. The authors also offer an extension that incorporates level 2 predictors. This approach is based on multivariate multilevel modeling and provides a complete decomposition of the gross (or null model) variance components. The approach is also implemented in the user-written Stata program twolevelr2, and the online supplement contains worked code for implementation in R. The authors illustrate this method with an example analyzing sibling similarities in lifetime income.
Citation: Sociological Methodology
PubDate: 2025-02-28T10:17:24Z
DOI: 10.1177/00811750251322786
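The authors' decomposition rests on multivariate multilevel modeling and is distributed as the Stata program twolevelr2, with worked R code in the online supplement. The Python sketch below is not a reimplementation of that decomposition; it only illustrates the familiar comparison of null-model and extended-model variance components that motivates the problem, under assumed data and column names.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative only: compare level-1 and level-2 variance components from a
# null and an extended two-level model. The file and column names (income,
# education, family_id) are assumptions, not the authors' data.
df = pd.read_csv("siblings.csv")

null = smf.mixedlm("income ~ 1", df, groups=df["family_id"]).fit()
full = smf.mixedlm("income ~ education", df, groups=df["family_id"]).fit()

def components(res):
    level2 = float(res.cov_re.iloc[0, 0])   # between-cluster (family) variance
    level1 = float(res.scale)               # within-cluster (residual) variance
    return level1, level2

l1_null, l2_null = components(null)
l1_full, l2_full = components(full)

# Naive "explained variance" shares. Note that the level-2 component can
# *increase* after adding a level-1 predictor, which is exactly the
# ill-definedness the article's decomposition is designed to resolve.
print("level-1 share explained:", 1 - l1_full / l1_null)
print("level-2 share explained:", 1 - l2_full / l2_null)
```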
Authors: R. Gordon Rinderknecht, Long Doan, Liana C. Sayer
Abstract: Sociological Methodology, Ahead of Print. Amazon’s Mechanical Turk (MTurk) and Prolific are popular online platforms for connecting academic researchers with respondents. A broad literature has sought to assess the extent to which these respondents are representative of the U.S. population in terms of their demographic background, yet no work has assessed the representativeness of their daily lives. The authors provide this analysis by collecting time diaries from 136 MTurk and 156 Prolific respondents, which they compare with 468 contemporaneous diary responses from the American Time Use Survey (ATUS). MTurk and Prolific respondents differ from ATUS respondents in several notable ways: they do less housework and care work, spend less time traveling, spend more time at home, and spend more time alone. In general, MTurk respondents worked more than ATUS respondents, and Prolific respondents spent more time in leisure. These differences persist even after adjusting for demographic differences. The findings highlight time use as a potentially major source of cross-sample differences that goes beyond demographics. Thus, scholars interested in these samples should consider how time use may moderate processes of interest.
Citation: Sociological Methodology
PubDate: 2025-01-25T09:51:07Z
DOI: 10.1177/00811750241312226
Authors: Jaclyn S. Wong, Lauren Valentino, Christina Pao, Katie Donnelly Moran, D’Lane Compton, Gayle Kaufman
Abstract: Sociological Methodology, Ahead of Print. Measuring social categories and phenomena in survey questionnaires is complicated in a dynamic and diverse society. The use of “other, describe”–style open-ended text boxes can address this issue, but researchers rarely use these data because they lack guidance on how to code open-text information and manage small samples for analyses. The authors offer a roadmap to key decisions regarding the use of “other, describe” answer options and data during three stages of research: data collection and instrument development, data cleaning, and data analysis. The authors then use two cases, the Perceptions of Housework Pilot Survey and the Perceptions of Discrimination Study, to illustrate how decision making about open-text responses to demographic survey questions unfolded. These cases highlight the complexities and trade-offs in decision making and their implications for knowledge production. The authors recommend meta-practices for flexibility, transparency, and reflexivity, as there is not one “right choice” when it comes to using “other, describe” data.
Citation: Sociological Methodology
PubDate: 2024-12-27T12:47:41Z
DOI: 10.1177/00811750241304774
Authors: Anna-Carolina Haensch, Reinhard Schunck
Abstract: Sociological Methodology, Ahead of Print. Systematically missing information on secondary respondents is a frequent problem in multiactor surveys. Budget and time constraints often prevent all variables collected for primary respondents (e.g., anchors) from being collected for secondary respondents (e.g., partners). Thus, a subset of variables is systematically missing for secondary respondents. This can severely limit the analysis potential of multiactor data, ruling out all research questions that would require (the same) information on primary and secondary respondents. The problem of systematically missing data is also present in other settings, for example, after changes in measurement instruments in repeated surveys or in ex post survey harmonization if one or more surveys did not include a specific variable. In these cases, using multiple imputation (MI) techniques to impute the missing variables is a common approach. The authors explore whether MI can be used when data on secondary respondents are systematically missing. Results from simulation studies show that imputation under the assumption of conditional independence between primary and secondary respondents’ variables leads to a strong bias toward zero in the estimated partial correlation between those variables. However, external data in the form of bridging studies can be used to estimate the partial correlation between the observed variable for the primary respondent and the unobserved variable for the secondary respondent, leading to estimates with less bias after MI.
Citation: Sociological Methodology
PubDate: 2024-12-02T10:22:53Z
DOI: 10.1177/00811750241299816
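A stylized simulation can illustrate the attenuation mechanism described here: if the partner (secondary-respondent) variable is imputed without any information linking it to the anchor variable, the imputed cases carry no anchor-partner association, so the pooled correlation shrinks toward zero. This is not the authors' simulation design; the data-generating values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000
rho = 0.5

# Stylized anchor-partner data: x (anchor) and y (partner) correlate at rho.
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# Suppose y is systematically missing for half the sample and is imputed
# under conditional independence from x (e.g., only from covariates that
# carry no information about x).
missing = rng.random(n) < 0.5
y_imp = y.copy()
y_imp[missing] = rng.standard_normal(missing.sum())  # imputations unrelated to x

print("true correlation:   ", np.corrcoef(x, y)[0, 1])
print("after CI imputation:", np.corrcoef(x, y_imp)[0, 1])  # roughly rho/2: biased toward zero
```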
Authors: Sarah Brothers, Caty Simon, Louise Vincent
Abstract: Sociological Methodology, Ahead of Print. Sociological approaches to digital and community-engaged research have experienced significant innovation in recent years. This article examines the development and implementation of a primarily virtual community-driven research (CDR) project with the National Survivors Union, the national drug-users union in the United States, during the COVID-19 pandemic. Relationships between researchers and directly impacted people, such as people who use drugs, face many barriers. These issues were exacerbated during COVID-19, when in-person research decreased while drug-related harms increased. In response, this project modified the CDR model for drug-use research. The CDR model is particularly beneficial for studies with marginalized populations who may mistrust researchers. In CDR, impacted community members are fundamental project drivers. This project’s data are based on 29 months of weekly group meetings in National Survivors Union online spaces, group and individual text conversations, phone calls, and shared-document group work. The project co-developed methods for CDR with directly impacted people, including community-initiated research questions, low-threshold methods, collaborative writing strategies, coauthorship practices foregrounding directly impacted perspectives, and multiple forms of dissemination. Modified CDR expands sociological methods for digital research, citizen science, and community-engaged research with vulnerable, criminalized groups. This approach may aid inclusive, innovative sociological scholarship and effective public health policy for reducing morbidity and mortality during multiple crises.
Citation: Sociological Methodology
PubDate: 2024-09-30T05:48:50Z
DOI: 10.1177/00811750241281063
Authors: Brian C. Kelly, Christie Sennott
Abstract: Sociological Methodology, Ahead of Print. The qualitative interview has been a core technique in the sociological methods toolkit for generations. Interviews provide essential insights into how participants experience the world around them. New opportunities have emerged to adapt traditional in-depth interview techniques through the use of evolving technologies available to interview participants. This article describes the integration of ecological momentary assessment techniques to augment qualitative in-depth interviews focused on specific events, which we term event-centered interviewing. By incorporating photo data captured systematically through smartphone apps designed for ecological momentary assessment, event-centered interviews can extend the strengths of traditional qualitative interviews. We describe the processes and procedures for conducting event-centered interviews, and we highlight how the approach may create opportunities for qualitative analysis and minimize certain limitations of traditional in-depth interviews. We also highlight the positive participant responses to the approach from a pilot study. Although traditional in-depth interviews may remain at the core of qualitative sociological inquiry, event-centered interviewing may be especially useful for interviews about behavior and experiences that occur during specific events.
Citation: Sociological Methodology
PubDate: 2024-09-28T07:24:14Z
DOI: 10.1177/00811750241283743
Authors: Björn Rohr, Henning Silber, Barbara Felderer
Abstract: Sociological Methodology, Ahead of Print. Previous studies have shown many instances where nonprobability surveys were not as accurate as probability surveys. However, because of their cost advantages, nonprobability surveys are widely used, and there is much debate over the appropriate settings for their use. To contribute to this debate, we evaluate the accuracy of nonprobability surveys by investigating the common claim that estimates of relationships are more robust to sample bias than means or proportions. We compare demographic, attitudinal, and behavioral variables across eight German probability and nonprobability surveys with demographic and political benchmarks from the microcensus and a high-quality, face-to-face survey. In the analyses, we compare three types of statistical inference: univariate estimates, bivariate Pearson’s r coefficients, and 24 different multiple regression models. The results indicate that in univariate comparisons, nonprobability surveys were clearly less accurate than probability surveys when compared with the population benchmarks. These differences in accuracy were smaller in the bivariate and the multivariate comparisons across surveys. In addition, the outcome of those comparisons largely depended on the variables included in the estimation. The observed sample differences are remarkable when considering that three nonprobability surveys were drawn from the same online panel. Adjusting the nonprobability surveys somewhat improved their accuracy.
Citation: Sociological Methodology
PubDate: 2024-09-28T07:20:54Z
DOI: 10.1177/00811750241280963
Authors: Bolun Zhang, Yimang Zhou, Dai Li
Abstract: Sociological Methodology, Ahead of Print. Validation is at the heart of methodological discussions about topic modeling. The authors argue that validation based on human reading hinges on distinctive words and readers’ labeling of a topic, and that it overlooks the possibility that semantically similar model solutions yield conflicting results in downstream analyses, such as regressions or other methods. This runs counter to the presumption that topic modeling can reveal features of documents that have some measurable association with social aspects outside the text. The authors develop a similar-topic identification procedure to verify that semantically similar solutions yield similar results in further analysis. The authors argue that future validations of topic modeling must consider such procedures.
Citation: Sociological Methodology
PubDate: 2024-07-25T10:42:32Z
DOI: 10.1177/00811750241265336
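The abstract does not spell out the similar-topic identification procedure. The sketch below shows one plausible, generic version of the idea: fit the same topic model under different random seeds, match topics across runs by the similarity of their word distributions, and check that the matched document-topic proportions (the quantities that would feed downstream regressions) agree. It is illustrative only, not the authors' procedure, and the corpus is a placeholder.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Illustrative sketch, not the authors' procedure: fit LDA twice with
# different seeds, match topics across runs, and compare the matched
# document-topic proportions used in further analysis.
docs = ["replace with a real corpus", "one document per string",
        "topic models need more text than this"]
X = CountVectorizer(min_df=1).fit_transform(docs)

lda_a = LatentDirichletAllocation(n_components=3, random_state=1).fit(X)
lda_b = LatentDirichletAllocation(n_components=3, random_state=2).fit(X)

# Match topics by cosine similarity of topic-word distributions.
sim = cosine_similarity(lda_a.components_, lda_b.components_)
row, col = linear_sum_assignment(-sim)      # best one-to-one matching

theta_a = lda_a.transform(X)                # document-topic proportions, run A
theta_b = lda_b.transform(X)[:, col]        # run B, reordered to match run A

# If the two solutions really capture "the same" topics, downstream results
# based on theta_a and theta_b should agree; large gaps signal fragile validation.
print(np.abs(theta_a - theta_b).max())
```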
Authors: Siwei Cheng, Andrew Levine, Ananda Martin-Caughey
Abstract: Sociological Methodology, Ahead of Print. In addition to overall dispersion, the distributional shape of economic status has attracted growing attention in the inequality literature. Economic polarization is a specific form of distributional change, characterized by a shrinking middle of the distribution and a growing top and bottom, with potentially important and unique social consequences. Building on relative distribution methods and drawing from the literature on job polarization, the authors develop an approach for analyzing economic polarization at the individual level. The method has three useful features. First, it offers intuitive and flexible measurement of economic polarization both between and within categories. Second, it helps disentangle two potential sources of economic polarization: compositional change, which involves changes to the allocation of workers across categories, and relative economic status change, which involves changes to the allocation of economic rewards between individuals. Third, it enables researchers to uncover and examine potential heterogeneity in economic polarization, for example, across occupations, geographic units, demographic and educational groups, and firms. The authors demonstrate the utility of this approach through two empirical applications: (1) an analysis of trends in wage polarization between and within occupations and (2) an examination of geographic variation in income polarization.
Citation: Sociological Methodology
PubDate: 2024-07-25T10:36:10Z
DOI: 10.1177/00811750241260731
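As a generic illustration of the relative distribution idea (not the authors' individual-level decomposition), the sketch below ranks a comparison-year distribution within a reference-year distribution; polarization appears as relative-rank mass piling up in the tails at the expense of the middle. The simulated wages and the simple tail-share summary are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Reference-year and comparison-year (log) wages; the comparison year is
# simulated with a wider distribution to mimic polarization.
reference = rng.normal(0.0, 1.0, 50_000)
comparison = rng.normal(0.0, 1.3, 50_000)

# Relative distribution: rank of each comparison observation in the reference
# distribution. Under no distributional change these ranks are uniform on (0, 1).
ranks = np.searchsorted(np.sort(reference), comparison) / reference.size

# A simple polarization summary: share of relative ranks in the outer fifths
# minus the 0.4 expected under uniformity. Positive values mean the middle
# has hollowed out relative to the reference year.
outer_share = np.mean((ranks < 0.2) | (ranks >= 0.8))
print("polarization index:", outer_share - 0.4)
```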
Authors: Moeen Mostafavi, Michael D. Porter, Dawn T. Robinson
Abstract: Sociological Methodology, Ahead of Print. The authors introduce BERTNN (Bidirectional Encoder Representations from Transformers Neural Network), a novel methodology designed to expand affective lexicons, a critical component in sociological research. BERTNN estimates the affective meanings and their distribution for new concepts, bypassing the need for extensive surveys by leveraging the concepts’ contextual usage in language. The cornerstone of BERTNN is the use of nuanced word embeddings from Bidirectional Encoder Representations from Transformers. BERTNN uniquely encodes words within the framework of synthesized social event sentences, preserving their meaning across actor-behavior-object positions. The model is fine-tuned on the basis of the implied sentiment changes, providing a more refined estimation of affective meanings. BERTNN outperforms previous approaches, setting a new standard in deriving multidimensional affective meanings for novel concepts. It efficiently replicates sentiment ratings that traditionally require extensive survey hours, demonstrating the power of automated modeling in sociological research. The expanded affective lexicons that can be produced with BERTNN cater to shifting cultural meanings and diverse subgroups, demonstrating the potential of computational linguistics to enrich the measurement tools in sociological research. This article underscores the novelty and significance of BERTNN in the broader context of sociological methodology.
Citation: Sociological Methodology
PubDate: 2024-07-25T10:32:52Z
DOI: 10.1177/00811750241260729
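BERTNN itself is not reproduced here. The sketch below only illustrates the generic first step the abstract describes: obtaining a contextual BERT embedding for a target concept placed inside a template social-event sentence, which a separate regression head (not shown) could then map to evaluation-potency-activity ratings. The model name, template sentence, and helper function are illustrative assumptions.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Illustrative first step only, not the authors' BERTNN: embed a target
# concept in a template actor-behavior-object sentence and pool the sub-word
# vectors that belong to it. Model name and template are assumptions.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def concept_embedding(concept: str, template: str = "the {} helped the child.") -> torch.Tensor:
    sentence = template.format(concept)
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]          # (tokens, 768)
    # Locate the concept's sub-word tokens inside the sentence and average them.
    concept_ids = tokenizer(concept, add_special_tokens=False)["input_ids"]
    ids = enc["input_ids"][0].tolist()
    start = next(i for i in range(len(ids)) if ids[i:i + len(concept_ids)] == concept_ids)
    return hidden[start:start + len(concept_ids)].mean(dim=0)

vec = concept_embedding("firefighter")
print(vec.shape)   # torch.Size([768]); a regressor could map this to EPA ratings
```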